What is cognitive computing?

Our guide to what is being described as the 'third age of computing'

For all the technological advancements in computing over the years - whether the internet, the cloud, or even artificial intelligence - the field has never quite produced anything to match the sophistication of the human brain.

Our own internal supercomputer is capable of processing data in ways that have yet to be fully replicated. We may not be able to retain vast amounts of data or perform complex calculations on demand, but we can reason, predict, rationalise, and make our own decisions - skills that remain unique to humans.

Yet that may not be true for much longer, as researchers are developing new systems that combine the intricate processes of the human brain with the vast data stores of a computer.

What is cognitive computing?

This is precisely what the field of cognitive computing is trying to achieve. Computing based on cognition, or the processes of the human brain, involves creating systems that can self-learn, recognise patterns and objects, understand language, and ultimately operate without human input.


It's often thought of as the third age of computing: the field first evolved from the simple calculators of the early 1900s to the programmable, mass-produced machines we see today, and cognitive systems are considered the next step. Cognitive computing also forms the backbone of most of the experimental forms of computing making the news, whether it be AI, machine learning, robotics, neural networks or virtual reality.

Unlike a traditional system, which simply performs the tasks a human has programmed it to do, a cognitive computer is built on machine learning algorithms. The system acquires knowledge by sifting through vast quantities of data, gradually learning to spot patterns and recognise inconsistencies, which it then uses to make predictions. The more data a system is exposed to, the more accurate its predictions become when it encounters something new.
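To make that concrete, here is a minimal sketch in Python using scikit-learn. It is not how any particular cognitive system is built, but it illustrates the core claim: the same model, trained on progressively more examples, becomes more accurate on data it has never seen.

    # Illustrative sketch only: a simple classifier trained on growing
    # slices of data, showing accuracy improve with more exposure.
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.5, random_state=0)

    for n in (50, 200, 800):
        model = LogisticRegression(max_iter=2000)
        model.fit(X_train[:n], y_train[:n])  # train on the first n examples
        print(f"trained on {n:>3} examples -> "
              f"accuracy on unseen data: {model.score(X_test, y_test):.2f}")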

Most importantly, cognitive computers are able to adapt to changing requirements or information, and use context to inform their learning. In theory, a human would never need to intervene in a cognitive system, as it would be able to adjust its own parameters to match the needs of the user.
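Again as a hedged illustration rather than any vendor's actual method, the same idea can be sketched with an online learner that updates its parameters as new information streams in, with no reprogramming between batches:

    # Illustrative sketch: an online model that keeps adapting as the
    # underlying pattern in the data drifts over time.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)
    model = SGDClassifier()
    classes = np.array([0, 1])

    for batch in range(5):
        X = rng.normal(size=(200, 4))
        # The rule generating the labels drifts slightly each batch.
        y = (X[:, 0] + 0.2 * batch * X[:, 1] > 0).astype(int)
        model.partial_fit(X, y, classes=classes)  # update, don't retrain
        print(f"batch {batch}: accuracy {model.score(X, y):.2f}")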

What can we do with it?

IBM's Watson famously demolished human players on Jeopardy! in 2011

With its Watson supercomputer, IBM perhaps holds the crown for the most recognisable cognitive system today. With access to more than 90 servers and 200 million pages of information, it's able to replicate the way a human might answer a question, forming hypotheses and drawing on evidence to back them up.
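As a purely illustrative toy - emphatically not IBM's actual question-answering pipeline - the hypothesis-and-evidence pattern can be shown in a few lines of Python: generate candidate answers, then score each one by how much supporting evidence a corpus provides. The corpus, question terms and candidates here are all invented for the example.

    # Toy hypothesis scoring: each candidate answer is ranked by how many
    # passages mention it alongside a term from the question.
    corpus = [
        "Toronto is a city in Canada",
        "Chicago is a city in the United States",
        "O'Hare airport serves the city of Chicago",
    ]

    def best_answer(question_terms, candidates):
        scores = {}
        for candidate in candidates:
            evidence = [doc for doc in corpus
                        if candidate in doc
                        and any(term in doc for term in question_terms)]
            scores[candidate] = len(evidence)
        return max(scores, key=scores.get), scores

    answer, scores = best_answer(["city", "airport"], ["Toronto", "Chicago"])
    print(answer, scores)  # Chicago {'Toronto': 1, 'Chicago': 2}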

It's already proved to have incredible potential - but only in narrow fields. For example, Watson is being trialled in the healthcare industry as a know-it-all assistant working alongside doctors, able to draw upon patient records, previous cases, academic journals, and diagnostics to create a list of recommendations.


As Watson's healthcare recommendations are based on live data, they would take into account emerging breakthroughs and new schools of thought far faster than a human doctor could. This is not to say that what the system comes up with is necessarily the right course of action, but it offers a way of analysing data far more efficiently than is currently possible for a human.

This type of information gathering could be applied to almost any industry and prove massively beneficial. Whether it's a cognitive system that analyses legal documents and case precedents, creates highly personalised educational experiences tailored to a school child's age, or even predicts criminal activity in a city from crime statistics and policing strategies, there's scope for cognitive computing to completely revolutionise industries in a way not seen since the introduction of the programmable computer.

This doesn't mean it would be able to bring this level of computer 'intuition' to another area of expertise immediately, though - it would have to ingest entirely new data sets and build its knowledge from the ground up.

