What is big data analytics?
Explaining descriptive, predictive and prescriptive methods of looking at data
Big Data has been a tech buzzword for almost a decade, with promises that collecting together all the information generated from websites, tills, devices, machines and more can increase profits and slash inefficiencies.
While it's true that Big Data is a valuable asset, with some dubbing it the "new oil", much like any other resource it's worthless if you don't do anything with it.
This is where Big Data analytics comes in. By applying specialist software and systems to these vast collections of information, organisations can gain actionable insights that help move them towards those fabled mega margins.
What is Big Data?
To understand Big Data analytics, you first need to understand what's being analysed.
Big Data is defined by three "v"s: volume, velocity and variety. In other words, with modern connectivity the way it is, a huge amount of information is generated almost every second of every day, and because it comes from many different sources, it arrives in many different formats.
For the purposes of analysis, it's this last element that matters most. The variety of data sources available now is immense: organisations may receive information from loyalty card schemes, website interactions, CCTV cameras, reviews, app usage data and more. All this data can be separated into two categories: structured and unstructured.
Structured data is what might come to mind when you think of "data" as a concept - information stored in a spreadsheet or database, for example.
Unstructured data, on the other hand, is the kind of information found in emails, phone calls and other more freeform configurations.
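The practical difference is easy to see in code. Below is a minimal sketch (the customer record and email text are invented for illustration): a structured CSV row maps straight onto named fields, while the same fact buried in freeform text has to be dug out with heuristics such as a regular expression.

```python
import csv
import io
import re

# Structured data: a CSV row maps neatly onto named fields.
structured = io.StringIO("customer_id,store,total\n1042,Leeds,23.50\n")
rows = list(csv.DictReader(structured))

# Unstructured data: the same fact buried in freeform text (e.g. an
# email) has to be extracted with heuristics like a regular expression.
email = "Hi, customer 1042 spent 23.50 at our Leeds store yesterday."
match = re.search(r"customer (\d+) spent ([\d.]+)", email)
extracted = {"customer_id": match.group(1), "total": match.group(2)}
```

Both end up as the same record, but the unstructured route is fragile - a differently worded email breaks the pattern, which is why analysing unstructured data at scale needs more sophisticated tooling.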
Big Data analytics technologies, such as Spark, Hadoop, NoSQL databases and MapReduce, are able to analyse both structured and unstructured data from a wide variety of sources, identifying meaningful patterns that can be used to drive new business initiatives or tweak existing strategies.
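To give a flavour of how these tools work, here is a toy single-machine sketch of the MapReduce pattern - real MapReduce distributes these two phases across a cluster, but the shape of the computation is the same. The word-count task is the classic introductory example, not anything specific to the products named above.

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, value) pairs - here, one (word, 1) per word.
    for line in records:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    # Shuffle/reduce: group values by key, then aggregate each group.
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

counts = reduce_phase(map_phase(["big data", "big insights"]))
# counts["big"] == 2
```

Because each mapped pair and each reduced key can be handled independently, the same logic scales from one laptop to thousands of machines - which is the point of frameworks like Hadoop.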
Types of analytics
Businesses need to be aware of the three types of analytics that can be deployed with Big Data.
The first is descriptive - for example, notifications, alerts and dashboards. These tell you what has previously happened, but don't give the reasons why it happened or what may change.
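Behind a descriptive dashboard or alert there is usually nothing more exotic than aggregation over historical records. A minimal sketch, using invented daily sales figures and an assumed 70% alert threshold:

```python
# Descriptive analytics: summarise what already happened.
daily_sales = [120, 135, 128, 95, 140, 132, 60]

average = sum(daily_sales) / len(daily_sales)

# Flag days that fell well below the average - the kind of rule behind
# a dashboard alert. The 70% threshold is an arbitrary assumption.
alerts = [day for day, total in enumerate(daily_sales)
          if total < 0.7 * average]
# alerts == [6]: day 6's total of 60 was unusually low
```

Note the alert says day 6 was bad, but nothing about why - that gap is exactly what predictive and prescriptive analytics try to fill.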
Next is predictive, a more useful form of analytics. This uses past data to model what could happen in the future - for example, how sales could be affected by market conditions, or how a customer might respond to a marketing campaign.
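At its simplest, "using past data to model the future" can mean fitting a trend and projecting it forward. The sketch below fits an ordinary least squares line to six quarters of hypothetical sales figures - real predictive models are far richer, but the idea is the same:

```python
# Predictive analytics: fit a trend to past data, project it forward.
# Six quarters of hypothetical sales figures.
quarters = [1, 2, 3, 4, 5, 6]
sales = [100, 110, 118, 131, 140, 152]

n = len(quarters)
mean_x = sum(quarters) / n
mean_y = sum(sales) / n

# Ordinary least squares slope and intercept.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(quarters, sales))
         / sum((x - mean_x) ** 2 for x in quarters))
intercept = mean_y - slope * mean_x

# Extrapolate the fitted line to the next quarter.
forecast_q7 = intercept + slope * 7
```

The forecast is only as good as the assumption that the past trend continues - which is why predictive models improve as more (and more varied) data becomes available.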
Finally, there's prescriptive analytics. This uses techniques such as A/B testing or optimisation testing to advise managers and employees on how best to fulfil their roles within an organisation. For example, it could help a police officer predict criminal activity, inform a salesperson on what types of discounts to offer customers or tell a web developer what ad will work best on a webpage.
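The A/B testing case can be sketched in a few lines. Here two hypothetical ad variants are compared by click-through rate, and the "prescription" is simply to recommend the better performer (a real test would also check the difference is statistically significant before acting on it):

```python
# Prescriptive analytics via a simple A/B comparison: which ad variant
# should the site serve? Click data below is invented for illustration.
variants = {
    "ad_a": {"shown": 1000, "clicks": 32},
    "ad_b": {"shown": 1000, "clicks": 45},
}

def click_rate(stats):
    return stats["clicks"] / stats["shown"]

# The "prescription": recommend the variant with the higher click rate.
best = max(variants, key=lambda name: click_rate(variants[name]))
# best == "ad_b"
```

This is what separates prescriptive from descriptive analytics: the output is not a report on what happened, but a recommended action.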
Trends in analytics
Tools to analyse data - whether it sits in a data lake, which stores data in its native format, or in a data warehouse - are still emerging, and a number of trends will determine how Big Data and its associated analytics operate in the future.
First is analytics in the cloud. As with much else, Big Data analytics is moving to the cloud: Hadoop, originally designed to process large datasets on clusters of physical machines, can now do so on cloud infrastructure. Among the cloud analytics services on offer are IBM Cloud, Amazon's Redshift hosted BI data warehouse and Kinesis data processing service, and Google's BigQuery data analytics service.
The use of predictive analytics is also increasing. As the technology becomes more powerful, larger datasets can be analysed, which in turn improves the accuracy of predictions.
Finally, there's deep learning. This is a set of machine-learning techniques that use neural networks to find interesting patterns in massive quantities of binary and unstructured data, and infer relationships without needing explicit programming or models. One deep learning algorithm has been used to look at Wikipedia data to learn that California and Texas are US states.
The combination of Big Data and analytics is an important part of keeping organisations one step ahead of the competition. But these businesses must also create the right conditions to enable data scientists and analysts to test theories based on the data that they have.