What is cognitive computing?

Our guide to what is being described as the 'third age of computing'

For all the technological advancements in computing over the years, whether the internet, the cloud or even artificial intelligence, the field has never quite produced anything to match the sophistication of the human brain.

Our own internal supercomputer is capable of processing data in ways that have yet to be replicated fully. We may not be able to retain vast amounts of data or perform complex calculations on demand, but we are able to reason, predict, rationalise, and make our own decisions - skills that are unique to humans.

Yet that may not be true for much longer, as researchers are developing new systems that amalgamate the incredibly intricate processes of the human brain with the vast data stores of a computer.

What is cognitive computing?

This is precisely what the field of cognitive computing is trying to achieve. Computing based on cognition, or the processes of the human brain, involves creating systems that are able to self-learn, recognise patterns and objects, understand language, and ultimately operate without the input of a human.

It's often thought of as the third age of computing, the field having first evolved from simple calculators in the early 1900s to the programmable machines we see mass-produced today. It also forms the backbone of most of the experimental forms of computing making the news, whether it be AI, machine learning, robotics, neural networks or virtual reality.

Unlike a traditional system that simply performs the tasks a human has already programmed it to do, a cognitive computer is built using machine learning algorithms. The system acquires knowledge by sifting through vast quantities of data, gradually learning to spot patterns and recognise inconsistencies, which it then uses to make predictions. The more data a system is exposed to, the more accurate it becomes when it encounters something new.
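
To make that idea concrete, here is a minimal toy sketch in Python using the scikit-learn library (both chosen purely for illustration; none of the systems described in this article are claimed to work this way). A simple classifier is trained on progressively larger slices of a synthetic data set, and its accuracy on examples it has never seen typically improves as the training data grows:

```python
# Toy illustration of "more data, better predictions" -- not how Watson
# or any commercial cognitive system is actually built.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for the vast quantities of records a real system would sift through
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

for n in (100, 500, 2500, len(X_train)):
    # Train on a growing slice of the data, then score on examples the model has never seen
    model = LogisticRegression(max_iter=1000).fit(X_train[:n], y_train[:n])
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>4} examples -> accuracy on unseen data: {accuracy:.2f}")
```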

Most importantly, cognitive computers are able to adapt to changing requirements or information, and use context to inform their learning. In theory, a cognitive system would never need to be reprogrammed by hand, as it could adjust its own parameters based on the needs of the user.
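
As a similarly hedged sketch of that adaptive behaviour (again in Python with scikit-learn, used here only as an illustration), an 'online' model can be updated batch by batch as new data arrives, rather than being retrained from scratch by a human operator:

```python
# Hypothetical sketch of adapting without being reprogrammed: an online
# learner is updated in place as fresh batches of data arrive.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier(random_state=0)
classes = np.array([0, 1])  # possible labels must be declared up front for incremental fitting

for batch in range(5):
    # Each batch stands in for newly arriving records whose underlying pattern drifts over time
    X_new = rng.normal(size=(200, 10))
    y_new = (X_new[:, 0] + 0.2 * batch * X_new[:, 1] > 0).astype(int)
    model.partial_fit(X_new, y_new, classes=classes)  # update the existing model, don't rebuild it
```

Real deployments would wrap this kind of update in monitoring and validation, but the core idea of learning continuously from new data is the same.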

What can we do with it?

IBM's Watson famously demolished human players on Jeopardy in 2011

With its Watson supercomputer, IBM perhaps holds the crown for the most recognisable cognitive system today. Running on more than 90 servers with access to 200 million pages of information, it's able to replicate the way a human might answer a question, forming hypotheses and using evidence to back them up.

It's already proved to have incredible potential - but only in narrow fields. For example, Watson is being trialled in the healthcare industry as a know-it-all assistant working alongside doctors, able to draw upon patient records, previous cases, academic journals and diagnostics to create a list of recommendations.

As Watson's healthcare results are based on live data, its recommendations would take into account emerging breakthroughs and schools of thought at a far greater pace than a human doctor could keep up with. This is not to say what the system comes up with is necessarily the right course of action, but it offers a way of analysing data far more efficiently than is currently possible for a human.

This type of information gathering could be applied to almost any industry and prove massively beneficial. Whether it be a cognitive system that's able to analyse legal documents and case precedents, or create highly personalised educational experiences based on the age of school children, or even predict criminal activity in a city by looking at strategies and statistics, there's scope for cognitive computing to completely revolutionise industries in a way not seen since the introduction of the programmable computer.

This doesn't mean it would be able to bring this level of computer 'intuition' to another area of expertise immediately though - it would have to ingest entirely new data sets and build its knowledge from the ground up.
