What are some of the applications and use cases? #askIoT
“If you just include ‘machine learning’ in your pitch you can add a zero on to the end of your valuation.”
Machine Learning (ML) and the Internet of Things (IoT) are huge buzzwords right now, and they’re both near the peak of the hype cycle. The above quote came somewhat jokingly from an investor, but it has some truth to it too.
Gartner’s 2016 Hype Cycle for Emerging Technologies — Machine Learning is at the very peak of the hype cycle, with IoT Platform and other related IoT technologies on the up-slope.
Given all the hype and buzz around machine learning and IoT, it can be difficult to cut through the noise and understand where the actual value lies. In this week’s #askIoT post, I’ll explain how machine learning can be valuable for IoT, when it’s appropriate to use, and some applications and use cases currently out in the world today.
Data Analytics vs. Machine Learning
With all the aforementioned hype around machine learning, many organizations are asking if they need to be using machine learning in their business somehow.
In the vast majority of cases, the answer is a resounding no.
Later I’ll explore the value of machine learning in greater depth, but at a high level, machine learning takes large amounts of data and generates useful insights that help the organization. That could mean improving processes, cutting costs, creating a better experience for the customer, or opening up new business models.
The thing is, most organizations can get many of these benefits from traditional data analytics, without the need for more complicated machine learning approaches.
Traditional data analysis is great at explaining data. You can generate reports or models of what happened in the past or of what’s happening today, drawing useful insights to apply to the organization.
Data analytics can help quantify and track goals, enable smarter decision making, and then provide the means for measuring success over time.
So When Is Machine Learning Valuable?
The data models that are typical of traditional data analytics are often static and of limited use in addressing fast-changing and unstructured data. When it comes to IoT, it’s often necessary to identify correlations between dozens of sensor inputs and external factors that are rapidly producing millions of data points.
While traditional data analysis would need a model built on past data and expert opinion to establish a relationship between the variables, machine learning starts with the outcome variables (e.g. saving energy) and then automatically looks for predictor variables and their interactions.
In general, machine learning is valuable when you know what you want but you don’t know the important input variables to make that decision. So you give the machine learning algorithm the goal(s) and then it “learns” from the data which factors are important in achieving that goal.
A great example is Google’s application of machine learning to its data centers last year. Data centers need to remain cool, so they require vast amounts of energy for their cooling systems to function properly (or you could just dunk them in the ocean). This represents a significant cost to Google, so the goal was to increase efficiency with machine learning.
With 120 variables affecting the cooling system (e.g. fans, pump speeds, windows), building a model with classic approaches would be a huge undertaking. Instead, Google applied machine learning and cut its overall energy consumption by 15%. That represents hundreds of millions of dollars in savings for Google in the coming years.
In addition, machine learning is also valuable for accurately predicting future events. Whereas the data models built using traditional data analytics are static, machine learning algorithms constantly improve over time as more data is captured and assimilated. This means that the machine learning algorithm can make predictions, see what actually happens, compare against its predictions, then adjust to become more accurate.
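That predict, observe, compare, adjust loop can be sketched in a few lines. Here a single model weight is nudged toward each new observation as it streams in; a real system would use a proper online-learning model, and all the values are invented.

```python
def update(weight, x, y, lr=0.05):
    """One online learning step for the simple model y ≈ weight * x:
    predict, measure the error against what actually happened, adjust."""
    prediction = weight * x
    error = prediction - y
    return weight - lr * error * x

weight = 0.0  # the model starts out knowing nothing
# A stream of (input, observed outcome) pairs arriving over time.
for x, y in [(1, 2.0), (2, 4.1), (1, 1.9), (3, 6.0)] * 20:
    weight = update(weight, x, y)

print(round(weight, 2))  # settles near 2, the underlying slope in the data
```

Each pass through the data tightens the estimate, which is why these models keep improving as more data is captured.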
The predictive analytics made possible by machine learning are hugely valuable for many IoT applications. Let’s take a look at a few concrete examples…
Applications in IoT
Cost Savings in Industrial Applications
Predictive capabilities are extremely useful in an industrial setting. By drawing data from multiple sensors in or on machines, machine learning algorithms can “learn” what’s typical for the machine and then detect when something abnormal begins to occur.
A company called Augury does exactly this with vibration and ultrasonic sensors installed on equipment:
“The collected data is sent to our servers, where it is compared with previous data collected from that machine, as well as data collected from similar machines. Our platform can detect the slightest changes and warn you of developing malfunctions. This analysis is done in real-time and the results are displayed on the technician’s smartphone within seconds.”
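A toy version of "learn what's typical, flag what's abnormal" looks like this: build a baseline from past vibration readings, then flag new readings that fall far outside it. Augury's actual pipeline is far more sophisticated, and the sensor values here are invented.

```python
from statistics import mean, stdev

# Vibration readings (mm/s) collected while the machine was known healthy.
baseline = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 0.49]
mu, sigma = mean(baseline), stdev(baseline)

def is_abnormal(reading, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from normal."""
    return abs(reading - mu) > threshold * sigma

print(is_abnormal(0.50))  # a typical reading
print(is_abnormal(0.95))  # far outside the baseline: a developing malfunction?
```

The same baseline-and-deviation idea underlies much of predictive maintenance, just with many more sensors and smarter models.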
Predicting when a machine needs maintenance is incredibly valuable, translating into millions of dollars in saved costs. A great example is Goldcorp, a mining company that uses immense vehicles to haul away materials.
When these hauling vehicles break down, it costs Goldcorp $2 million per day in lost productivity. Goldcorp is now using machine learning to predict with over 90% accuracy when machines will need maintenance, meaning huge cost savings.
Shaping Experiences to Individuals
We’re actually all familiar with machine learning in our everyday lives. Both Amazon and Netflix use machine learning to learn your preferences and deliver a better experience, whether that’s suggesting products you might like or recommending relevant movies and TV shows.
Similarly, in IoT machine learning can be extremely valuable in shaping our environment to our personal preferences.
The Nest Thermostat is a great example: it uses machine learning to learn your preferences for heating and cooling, making sure that the house is the right temperature when you get home from work or when you wake up in the morning.
The use cases described above are just a few of the virtually infinite possibilities, but they’re important because they’re useful applications of machine learning in IoT that are happening right now.
We’re Just Scratching the Surface
The billions of sensors and devices that will continue to be connected to the internet in the coming years will generate exponentially more data. As I discussed in my last post, this huge increase in data will drive great improvements in machine learning, opening countless opportunities for us to reap the benefits.
Not only will we be able to predict when machines need maintenance, we’ll be able to predict when we need maintenance too. Machine learning will be applied to the data from our wearables to learn our baseline and determine when our vitals have become abnormal, calling a doctor or ambulance automatically if necessary.
Beyond individuals, we’ll be able to use that health data at scale to see trends across entire populations, predicting outbreaks of disease and proactively addressing health problems.
We’ll also be able to predict accidents and crime before they even happen. Data from noise sensors, video cameras, even smart trash-bins in Smart Cities can be fed into machine learning algorithms to discover the preconditions for accidents or crime, equipping law enforcement with powerful tools (of course, there are some privacy concerns).
Although both machine learning and IoT are at the height of hype, the future applications and possibilities are worthy of that hype. We’re really just scratching the surface of what’s possible.
Still have IoT questions?
Every week I’ll write a new Medium post explaining an IoT concept in simple, non-technical terms. If you have any concepts you’d like explained in a post, please let me know in the comments below! Or if you have a short question you’d like answered quickly, just tweet us using #askIoT!
Disclaimer: This is a curated post. The statements, opinions and data contained in these publications are solely those of the individual authors and contributors and not of iamwire or its editor(s). This article was originally published by the author here.