This column is by Leading Business and Data Expert, Bernard Marr
Up until very recently, computers needed a complicated and extremely precise set of instructions in order to accomplish even the simplest of tasks.
Who among us remembers programming via punch cards? Or DOS?
Computer programming languages have evolved over the years, but the biggest step has been the move toward eliminating complicated programming altogether. In other words, we are teaching computers to learn for themselves, an approach dubbed machine learning.
Because machine learning is such a promising leap forward in technological ability, it has the very real potential to affect every person in every field of business in the near future. With that in mind, here are a few things I think everyone should know:
1. What is machine learning?
With machine learning, rather than telling a computer exactly how to solve a problem, the programmer instead tells it how to go about learning to solve the problem for itself.
Machine learning is, in essence, an advanced application of statistics: learning to identify patterns in data and then making predictions from those patterns. (This website has a visualized walkthrough of how machine learning works, if you are interested.)
Machine learning started as far back as the 1950s, when computer scientists figured out how to teach a computer to play checkers. From there, as computational power has increased, so has the complexity of the patterns a computer can recognize, and therefore the predictions it can make and problems it can solve.
A machine learning algorithm is given a “teaching set” of data, then asked to use that data to answer a question. For example, you might provide a computer with a teaching set of photographs, some labeled “this is a cat” and some labeled “this is not a cat.” Then you could show the computer a series of new photos and it would begin to identify which photos were of cats.
The system then continues to add to its teaching set. Every photo it identifies — correctly or incorrectly — gets added to the teaching set along with the right answer, and the program effectively gets “smarter” and better at completing its task over time.
It is, in effect, learning.
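The cat-photo loop above can be sketched in a few lines of Python. This is a deliberately toy version: each "photo" is boiled down to two invented numeric features (say, ear pointiness and whisker density), and the classifier simply predicts whichever label's average example looks closest. All names and numbers here are illustrative assumptions, not a real image-recognition pipeline.

```python
# Toy sketch of the "teaching set" idea: a nearest-centroid classifier.
# Each "photo" is reduced to two invented numeric features; this shows
# the learning loop, not a real image-recognition system.

def centroid(points):
    """Element-wise average of a list of feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def distance_sq(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

class TeachingSetClassifier:
    """Predicts the label whose examples, on average, look most similar."""

    def __init__(self):
        self.teaching_set = {"cat": [], "not cat": []}

    def teach(self, features, label):
        # Feeding each newly labelled photo back into the teaching set
        # is what makes the program "smarter" over time.
        self.teaching_set[label].append(features)

    def predict(self, features):
        return min(
            self.teaching_set,
            key=lambda label: distance_sq(features, centroid(self.teaching_set[label])),
        )

clf = TeachingSetClassifier()
for photo in [(0.9, 0.8), (0.8, 0.9)]:   # photos labelled "this is a cat"
    clf.teach(photo, "cat")
for photo in [(0.1, 0.2), (0.2, 0.1)]:   # photos labelled "this is not a cat"
    clf.teach(photo, "not cat")

print(clf.predict((0.85, 0.75)))  # a brand-new photo -> prints "cat"
```

Real systems use far richer features and models, but the loop is the same: teach with labelled examples, predict on new ones, and feed corrected predictions back in via `teach()`.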
2. What makes machine learning so exciting?
Computers can now boldly go into realms that were once considered solidly humans’ domain. The technology still isn’t perfect in many cases, but the very concept of machine learning — that machines can continuously and tirelessly improve, with theoretically no ceiling on their performance — means they will only get better.
Computers can now “see” images and categorize them, as in our cat photo example above. They can “read” text and numbers in those images, as well as identify individual people and places. They can not only read text, but understand the context, including whether the sentiment is positive or negative.
Beyond that, computers can also listen to us, understand us, and respond. The virtual assistant in your pocket — be it Siri, Cortana, or Google Now — represents a major leap forward in the ability of computers to understand natural human speech, and they’re continuously improving.
Computers can also now write. Machine learning algorithms are already being used to write basic news articles in areas that require a lot of data, like financial reporting and even sports. This has broad implications for all kinds of data entry and classification tasks that previously required human intervention. If a computer can recognize something — an image, a document, a file, etc. — and describe it accurately, there could be many uses for such automation.
3. How is machine learning being used today?
People are already doing very exciting things with machine learning algorithms.
One study used computer-assisted diagnosis (CAD) to review the early mammography scans of women who later developed breast cancer, and the computer spotted 52% of the cancers as much as a year before the women were officially diagnosed. Machine learning can also be used to understand risk factors for disease in large populations. The company Medecision developed an algorithm that was able to identify eight variables to predict avoidable hospitalizations in diabetes patients.
Perhaps you’ve visited an online store and looked at a product but didn’t buy it — and then you see digital ads across the web for that exact product for days afterward. That kind of marketing personalization is just the tip of the iceberg. Companies can personalize which emails a customer receives, which direct mailings or coupons, which offers they see, which products show up as “recommended” and so on, all designed to lead the consumer more reliably toward a sale.
Natural language processing (NLP) is being used in all sorts of exciting applications across disciplines. Machine learning algorithms with natural language capabilities can stand in for customer service agents and more quickly route customers to the information they need. The technology is also being used to translate obscure legalese in contracts into plain language and to help attorneys sort through large volumes of information to prepare for a case.
IBM recently surveyed top auto executives, and 74% expected that we would see smart cars on the road by 2025. A smart car would not only integrate into the Internet of Things, but also learn about its owner and its environment. It might adjust the internal settings — temperature, audio, seat position, etc. — automatically based on the driver, report and even fix problems itself, drive itself, and offer real-time advice about traffic and road conditions.
4. What will machine learning do in the future?
The possibilities are practically limitless when it comes to what machine learning will be able to achieve in the future. Some exciting possibilities include:
- Personalized healthcare that looks at your genetic makeup and lifestyle factors to create unique health care and treatment plans for you.
- Data security programs that can detect malware, viruses and attacks with a high degree of accuracy.
- Computer-assisted security at places like airports and stadiums that can predict who might be a threat and see things human screeners might miss.
- Self-driving cars that can navigate on their own and reduce traffic and accidents.
- Advanced fraud detection in financial and insurance fields to save us all money.
- Even a “universal translator” that will allow you to speak into your phone or other device and have the language translated accurately and instantly.
5. Why should I care about machine learning?
For many people, these advances will simply appear as welcome new technologies; they won’t care how the technology works or what’s going on behind the scenes.
But we should all care, because for all the good things machine learning will bring, it could also change the shape of our workforce.
The application of machine learning to the ever-increasing amounts of data being produced by nearly every person on the planet will change everything when it comes to our jobs. Yes, these new technologies will make jobs easier for many people — but they also may make many of those jobs obsolete. Algorithms can now answer our emails, interpret medical images, find the legal cases we need to win, analyze our data, and more.
Machine learning relies on algorithms that “learn” from past examples, thereby relieving the programmer from having to write lines of code to deal with every eventuality. This ability to learn, coupled with advances in robotics and mobile technology, means that computers can now help humans perform complex tasks faster and better than ever before.
But what happens to the human they’ve outpaced?
The World Economic Forum has predicted that we will lose 5 million jobs to computers and robots over the next five years.
That means that no matter what your job — from paralegal to diagnostician to customer service rep to truck driver — you need to be paying attention to how machine learning could affect your field, your business, and your job.
The best way to ensure that you aren’t taken by surprise when computers start to take these jobs is to be proactive and prepared now.
Disclaimer: This is a curated post. The statements, opinions and data contained in this column are solely those of the individual authors and contributors and not of iamwire or its editor(s). This article was initially published here.
Image Credit: respondr.io