What is cognitive computing?

Our guide to what is being described as the 'third age of computing'

For all the technological advancements in computing over the years - whether the internet, the cloud, or even artificial intelligence - the field has never quite produced anything to match the sophistication of the human brain.

Our own internal supercomputer is capable of processing data in ways that have yet to be replicated fully. We may not be able to retain vast amounts of data or perform complex calculations on demand, but we are able to reason, predict, rationalise, and make our own decisions - skills that are unique to humans.

Yet that may not be true for much longer, as researchers are developing new systems that amalgamate the incredibly intricate processes of the human brain with the vast data stores of a computer.

What is cognitive computing?

This is precisely what the field of cognitive computing is trying to achieve. Computing based on cognition, or the processes of the human brain, involves creating systems that are able to self-learn, recognise patterns and objects, understand language, and ultimately operate without the input of a human.

It's often thought of as the third age of computing, computing itself having first evolved from simple calculators in the early 1900s to the programmable machines we see mass-produced today. Cognitive computing also forms the backbone of most of the experimental forms of computing we see making the news, whether it be AI, machine learning, robotics, neural networks or virtual reality.

Unlike a traditional system that simply performs the tasks a human has already programmed it to do, a cognitive computer is built using machine learning algorithms. The system acquires knowledge by sifting through vast quantities of data, gradually learning to spot patterns and recognise inconsistencies, and then uses what it has learned to make predictions. The more data a system is exposed to, the more accurate it becomes when it encounters something new.
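
By way of illustration, the sketch below - written in Python with the general-purpose scikit-learn library, an arbitrary choice for this example rather than any vendor's cognitive platform - trains a simple model on progressively larger slices of a dataset and shows its accuracy on unseen examples improving as the data grows.

```python
# A minimal sketch of the pattern described above: a model trained on
# progressively more data tends to predict unseen examples more accurately.
# This is ordinary supervised machine learning, not any specific
# cognitive system.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hold back a fixed test set to stand in for "something new" the system
# has not encountered while learning.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Expose the model to larger and larger slices of the training data and
# measure how well it handles the unseen test examples each time.
for n in (100, 400, len(X_train)):
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n], y_train[:n])
    print(f"trained on {n:4d} examples -> accuracy {model.score(X_test, y_test):.2f}")
```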

Most importantly, cognitive computers are able to adapt to changing requirements or information, and use context to inform learning. In theory, there would never be a need to intervene in a cognitive system, as it would be able to change its parameters based on the needs of the user.
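
The sketch below gives one rough illustration of that kind of adaptation, using scikit-learn's incremental partial_fit interface to keep updating a model as batches of gradually changing, synthetic data arrive - a stand-in for the general idea rather than a description of how any particular cognitive system is built.

```python
# A hypothetical sketch of adapting to changing data without being
# reprogrammed: an online model is updated incrementally as new,
# gradually drifting batches of (invented) data arrive.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier(random_state=0)
classes = np.array([0, 1])

def drifting_batch(shift):
    """Generate a labelled batch whose decision boundary has drifted by `shift`."""
    X = rng.normal(size=(200, 2)) + shift
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# Stream batches whose distribution slowly changes; the model keeps
# adjusting its own parameters rather than waiting to be rewritten.
for step, shift in enumerate([0.0, 0.5, 1.0, 1.5]):
    X, y = drifting_batch(shift)
    model.partial_fit(X, y, classes=classes)
    print(f"step {step}: accuracy on the latest batch {model.score(X, y):.2f}")
```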

What can we do with it?

IBM's Watson famously demolished human players on Jeopardy! in 2011

With its Watson supercomputer, IBM perhaps holds the crown for the most recognisable cognitive system today. Running on a cluster of 90 servers with access to 200 million pages of information, it's able to replicate the way a human might answer a question, including forming hypotheses and using evidence to back up a theory.
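
As a purely illustrative toy version of that 'hypothesis plus evidence' idea - emphatically not a reconstruction of IBM's actual DeepQA pipeline, and using invented data - candidate answers can be ranked by how many evidence passages appear to support them.

```python
# Toy illustration: rank candidate answers (hypotheses) by how many
# passages in a tiny "evidence" corpus mention them alongside words
# from the question. All data here is invented for the example.
EVIDENCE = [
    "The hippocampus plays a major role in forming new memories.",
    "Damage to the hippocampus is linked to memory loss.",
    "The cerebellum coordinates voluntary movement and balance.",
]

CANDIDATES = ["hippocampus", "cerebellum", "amygdala"]

def support(candidate, question_terms):
    """Count evidence passages that mention the candidate and share
    vocabulary with the question - a crude stand-in for evidence scoring."""
    score = 0
    for passage in EVIDENCE:
        words = set(passage.lower().replace(".", "").split())
        if candidate in words and words & question_terms:
            score += 1
    return score

question = "Which brain region is most associated with forming memories?"
terms = set(question.lower().replace("?", "").split())

ranked = sorted(CANDIDATES, key=lambda c: support(c, terms), reverse=True)
print("Best-supported hypothesis:", ranked[0])
```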

It has already shown incredible potential - but only in narrow fields. For example, Watson is being trialled in the healthcare industry as a know-it-all assistant working alongside doctors, able to draw upon patient records, previous cases, academic journals, and diagnostics to create a list of recommendations.

As Watson's healthcare results are based on live data, its recommendations would take emerging breakthroughs and schools of thought into account at a far greater pace than a human doctor could. This is not to say what the system comes up with is necessarily the right course of action, but it offers a way of analysing data far more efficiently than is currently possible for a human.

This type of information gathering could be applied to almost any industry, to huge benefit. Whether it's a cognitive system that can analyse legal documents and case precedents, create highly personalised educational experiences based on the ages of schoolchildren, or even predict criminal activity in a city by looking at strategies and statistics, there's scope for cognitive computing to completely revolutionise industries in a way not seen since the introduction of the programmable computer.

This doesn't mean such a system could bring this level of computer 'intuition' to another area of expertise immediately, though - it would have to ingest entirely new data sets and build its knowledge from the ground up.
