Machine learning vs AI

These two terms are often used interchangeably, but they are fundamentally different technologies.

We're in an exciting time when it comes to the birth of new digital technologies. Things that within most adults' lifetimes were the stuff of science fiction are increasingly becoming part of everyday life.

Nothing's ever simple in the world of tech, though, and perhaps because of this transition from the fantastical to the mundane there can be confusion between different terms and the technologies they refer to.


Artificial intelligence (AI) and machine learning (ML) are two such terms: Although they're often used interchangeably in popular culture and media, they aren't actually the same thing.

The reason for the confusion is likely twofold. Firstly, AI is an attribute applied quite loosely to anything deemed "smart". Home assistants, such as Google Home, Alexa and Siri, are considered AI and are probably the scenario where most people encounter the technology, but it can also refer to some forms of suggested text, such as during email composition. Away from human interactions, AI is also interwoven in other technologies behind the scenes: intelligent load balancing, for example, or some network security systems.

Secondly, machine learning is a subset of AI, meaning that while ML is AI, AI is not necessarily ML. To confuse matters further, ML also has various subdisciplines of its own, such as deep learning and reinforcement learning.

What's the difference between ML and AI?

The history of AI is a long one: for thousands of years, humans have dreamt of and mythologised machines and other creations that could "come to life", behaving and thinking as humans do. There was a time when early computers, due to their "logical" nature, were also considered a type of artificial intelligence.


In its current manifestation, however, the idea of AI can trace its history to British computer scientist and World War II codebreaker Alan Turing. He proposed a test, which he called the imitation game but is more commonly now known as the Turing Test, where one individual converses with two others, one of which is a machine, through a text-only channel. If the interrogator is unable to tell the difference between the machine and the person, the machine is considered to have "passed" the test.

This basic concept is referred to as "general AI" and is generally considered to be something that researchers have yet to fully achieve.

However, "narrow" or "applied" AI has been far more successful at creating working models. Rather than attempt to create a machine that can do everything, this field attempts to create a system that can perform a single task as well as, if not better than, a human.


It's within this narrow AI discipline that the idea of machine learning first emerged, as early as the middle of the twentieth century, in fact. First defined by AI pioneer Arthur Samuel in a 1959 academic paper, ML represents "the ability to learn without being explicitly programmed". In other words, where narrow AI may rely on human-built algorithms, a system built using ML could create its own.
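To make Samuel's definition concrete, here is a minimal sketch of "learning without being explicitly programmed". Rather than hard-coding the rule y = 2x + 1, the program is given only example data and recovers the rule itself via gradient descent. The function name and the toy data are our own illustration, not anything from a particular ML library.

```python
def fit_line(points, lr=0.01, epochs=2000):
    """Learn w and b in y = w*x + b from (x, y) example pairs alone."""
    w, b = 0.0, 0.0
    n = len(points)
    for _ in range(epochs):
        # Gradient of mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in points) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in points) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# The "hidden" rule behind the data is y = 2x + 1; the program never sees it.
data = [(x, 2 * x + 1) for x in range(10)]
w, b = fit_line(data)  # w converges towards 2.0, b towards 1.0
```

Nobody told the program the coefficients 2 and 1; it inferred them from examples, which is the essence of Samuel's distinction.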

Uses and applications

Machine learning

Interest in ML has waxed and waned over the years, but with data becoming an increasingly important part of business strategy, it's fallen back into favour as organisations seek ways to analyse and make use of the vast quantities of information they collect on an almost constant basis.

When this data is put into a machine learning program, the software not only analyses it but learns something new with each new dataset, becoming a growing source of intelligence. This means the insights that can be learnt from data sources become more advanced and more informative, helping companies develop their business in line with customer expectations.


One application of ML is in recommendation engines, such as Facebook's news feed algorithm or Amazon's product recommendation feature. ML can analyse which posts people are liking, commenting on or sharing, or what people with similar interests are buying, and then surface a post or product to other users it predicts will be interested.
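The simplest version of that idea is user-to-user collaborative filtering: find the user whose behaviour most resembles yours and suggest what they engaged with that you haven't. The toy data and function names below are our own hypothetical sketch, not Facebook's or Amazon's actual systems.

```python
import math

# Hypothetical data: which of four posts each user has liked (1 = liked).
likes = {
    "alice": [1, 1, 0, 0],
    "bob":   [1, 1, 1, 0],
    "carol": [0, 0, 1, 1],
}

def cosine(u, v):
    """Cosine similarity between two like-vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(user):
    """Suggest posts the most similar other user liked but `user` has not."""
    others = [(cosine(likes[user], vec), name)
              for name, vec in likes.items() if name != user]
    _, nearest = max(others)
    return [i for i, (mine, theirs)
            in enumerate(zip(likes[user], likes[nearest])) if theirs and not mine]

recommend("alice")  # → [2]: bob is most similar to alice and liked post 2
```

Real recommendation engines work at vastly larger scale and with richer signals, but the core "people like you also liked" logic is the same.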

ML is also particularly useful for image recognition: humans label what's in a set of pictures as a form of training, and the system then uses those examples to autonomously identify what's in new pictures. For example, machine learning can analyse the distribution of pixel values in an image to work out what the subject is.
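A heavily simplified sketch of that training loop: learn the typical pixel-intensity profile of each human-labelled class, then assign a new image to the closest profile. This nearest-centroid approach on 2x2 "images" is our own toy illustration; real image recognition uses far richer features and models.

```python
def mean_intensity(image):
    """Average pixel value of a grid of 0-255 grayscale pixels."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def train(labelled):
    """Learn one average intensity per human-provided label."""
    return {label: sum(mean_intensity(img) for img in examples) / len(examples)
            for label, examples in labelled.items()}

def classify(image, centroids):
    """Assign the label whose learned profile is closest to this image."""
    value = mean_intensity(image)
    return min(centroids, key=lambda label: abs(centroids[label] - value))

# Hypothetical labelled training data: tiny 2x2 grayscale images.
examples = {
    "dark":  [[[10, 20], [30, 0]], [[5, 15], [25, 35]]],
    "light": [[[240, 250], [230, 255]], [[200, 220], [210, 230]]],
}
centroids = train(examples)
classify([[220, 240], [235, 225]], centroids)  # → "light"
```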

Enterprises are now turning to ML to drive predictive analytics, as big data analysis becomes increasingly widespread. The association with statistics, data mining and predictive analysis has become dominant enough for some to argue that machine learning is a separate field from AI.

The reason for this is that AI technology, such as natural language processing or automated reasoning, can operate without any machine learning capability; equally, ML systems do not always need other features of AI.


AI

There are hundreds of use cases for AI, and more are becoming apparent as companies adopt artificial intelligence to tackle business challenges.

One of the most common uses of AI is in automation. In cyber security, for example, AI algorithms can be programmed to detect threats that may be difficult for a human to spot, such as subtle changes in user behaviour or an unexplained increase in the amount of data being transferred to and from a particular node (e.g. a computer or sensor). In the home, assistants like Google Home or Alexa can help automate lighting, heating and interactions with businesses through chatbots.
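One way such a system might flag an unexplained spike in data transfer is a simple statistical baseline: learn a node's normal traffic pattern, then alert when a new reading deviates by more than a few standard deviations. The threshold, data and function below are a hypothetical sketch of this idea, not any particular security product's method.

```python
import statistics

def is_anomalous(history, latest, threshold=3.0):
    """Flag `latest` (e.g. MB transferred per hour for one node) when it
    sits more than `threshold` standard deviations from the baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

# Hypothetical baseline: past hourly transfer volumes in MB for one node.
baseline = [100, 102, 98, 105, 97, 101, 99, 103]

is_anomalous(baseline, 400)  # → True: a sudden large transfer
is_anomalous(baseline, 101)  # → False: within normal variation
```

Production systems model far more signals (time of day, destination, user role), but thresholding against a learned baseline is the underlying principle.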

There are well-founded fears that AI will replace human job roles, such as data input, at a faster rate than the job market will be able to adapt to. Author and venture capitalist Kai-Fu Lee, who has worked at both Apple and Google and earned a PhD from Carnegie Mellon for the development of an advanced speech recognition AI, warned in 2019 that "Many jobs that seem a little bit complex, a chef, a waiter, a lot of things, will become automated.


"We will have automated stores, automated restaurants and all together, in 15-years, that's going to displace about 40% of jobs in the world."

Confusing AI and ML

Beyond machine learning and AI, there are a host of other terms often thrown into the mix, confusing things further. For example, artificial neural networks are designed to process information in a way similar to the human brain and can be used for machine learning too, although, once again, not all neural nets are AI or ML, and not every ML program uses an underlying neural net.

As this is a developing field, terms are popping in and out of existence all the time and the barriers between the different areas of AI are still quite permeable. As the technology becomes more widespread and more mature, these definitions will likely also become more concrete and well known. Or who knows, we may develop generalised AI and all the definitions will fall away.
