How to solve your big data problem
Big data brings new challenges for business, but Intel’s latest technology can help you meet them and make the most of the potential
Big data should mean big opportunities for every business, yet for many the potential is never realised. Data that could be mined for value ends up resting, unused, in archives. Insights that could be fuelling sales or enhancing customer experiences are never uncovered, or never make it to the screen of someone who could make the difference. Every day, some 2.5 quintillion bytes of data are created, and the volume of data being produced is expected to be 44 times greater in 2020 than it was in 2009.
Yet in 2016 the analysts at Forrester estimated that, on average, between 60% and 73% of all the data within an organisation went untouched by analytics or BI applications, and in some industries the figure may be closer to 95% or more. To call this a shame is an understatement. EU studies have shown that companies that adopt big data analytics can increase productivity by 5% to 10% over companies that don't, and that big data practices in Europe could add 1.9% to GDP by 2020.
For many companies, some of this will come down to a lack of processes to capture, refine and structure the right data, or to a lack of the skilled workers needed to extract the most value. Yet technological issues also have a large part to play.
For a start, too many big data initiatives are held back by poor performance and underwhelming results. Big data analytics applications are extremely hardware-intensive, pushing not just compute resources to their limit (running complex operations on huge datasets is never easy) but also storage and network resources. Processor cores can be sitting ready to go, yet bottlenecked by slow transfers of 'cold' data from slower, mass data storage devices to 'hot' data resources, specifically DRAM, in more direct contact with the CPU. Today's business thrives on speed and agility, and when mining data for insight takes too long, the excitement around these new applications dwindles. BI and analytics projects that should be powering growth become unloved and under-used.
What's more, these initiatives are expensive. Small and medium-sized enterprises baulk at the cost of the hardware needed to run these applications, and of the storage and supporting infrastructure required to hold and move all that data. Worse, the real-time, in-memory applications being used by larger enterprises to analyse data at the point it flows into the business are, financially speaking, out of reach. This isn't simply because of licensing costs (there are less expensive and open-source alternatives to the big names) but because the processing power, storage and high-capacity DRAM required don't come cheap. It's a sizable investment, perhaps too sizable for many smaller businesses.
Meeting the Big Data Challenge
So, what can businesses do when confronted with these issues? There are clearly steps they can take in terms of data capture and preparation, processes and recruitment of data specialists, but technology also has its part to play in solving our big data problems, and more specifically, the technology in Intel's new Xeon Scalable processor architecture and Intel's Optane storage and memory products.
Let's start with Xeon Scalable. Its new mesh architecture, where CPU cores are linked to each other and to memory, network and storage resources by a mesh of connections, enables Xeon Scalable processors to handle demanding, data-intensive tasks more efficiently than the old Xeon architecture, where everything connected through a single central ring. Its new AVX-512 instructions are purpose-built to accelerate performance in the compute- and data-focused workloads characteristic of big data analytics. Running batch analytics, new Intel Xeon Scalable processors performed 1.4 times faster, on average, than the previous-generation Intel Xeon, while enterprises running Cassandra NoSQL databases have seen up to 4.6 times the number of operations per second. Running SAP HANA in-memory workloads, Xeon Scalable shows nearly a 60% boost in performance.
Yet there's more to this than just running workloads faster. With Intel Xeon Scalable behind them, enterprises can think about using machine learning and predictive analytics to find, refine and manage data before it's processed, helping to speed up analytics operations and deliver more accurate, more useful and more actionable results.
With 48 lanes of PCIe 3.0 connectivity per CPU, Xeon Scalable makes the perfect partner for Intel's latest Optane storage devices. Storage has always been a bottleneck for big data, but with reduced latency and stronger performance on write-intensive jobs than conventional flash SSDs, Optane drives dispense with the old limitations. Optane's resilience, up to 10x that of a standard SSD, makes it a better choice for caching or short-term storage, creating a zone for 'warm' data between the existing 'hot' DRAM and 'cold' storage zones. Put Intel Xeon Scalable and Optane together, and you can see performance in SAS workloads double over Intel's previous best-of-breed platform.
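The tiering idea behind that 'warm' zone can be illustrated in a few lines of code. The sketch below is purely conceptual (the class name, capacity and stores are invented for illustration, not an Intel API): frequently accessed items are kept in a small, fast 'warm' tier so that repeat reads avoid the slow 'cold' store, with least-recently-used items evicted when the tier fills.

```python
from collections import OrderedDict

class WarmCache:
    """Conceptual sketch of a 'warm' tier sitting between fast 'hot' memory
    and slow 'cold' mass storage. All names and sizes are illustrative."""

    def __init__(self, cold_store, capacity=4):
        self.cold_store = cold_store   # dict standing in for slow bulk storage
        self.capacity = capacity       # how many items the warm tier can hold
        self.warm = OrderedDict()      # LRU-ordered warm tier

    def read(self, key):
        if key in self.warm:
            # Warm hit: serve quickly and mark as most recently used
            self.warm.move_to_end(key)
            return self.warm[key]
        # Cold miss: pay the slow read once, then keep a warm copy
        value = self.cold_store[key]
        self.warm[key] = value
        if len(self.warm) > self.capacity:
            self.warm.popitem(last=False)  # evict least-recently-used item
        return value
```

In a real deployment the warm tier would be an Optane device rather than an in-process dictionary, but the access pattern, absorbing repeat reads before they reach cold storage, is the same.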
Last, but certainly not least, Intel's new Optane DC Persistent Memory combines the capacity of flash with near-DRAM levels of performance, at significantly lower cost than DRAM but in a standard DRAM DIMM form factor. With support from the new Xeon Scalable processors, this enables enterprises to run real-time analytics in-memory without anything like the same levels of investment. Potentially, this could see a much broader range of enterprises using these applications and turning data into insight at the speed of modern business. It's a step that enables more companies to make more effective use of the data flowing through the business, and one that levels the playing field a little, so that smaller organisations can make the most of their agility and compete.
Can technology alone solve your big data problem? No, but it can help you mitigate the issues, prepare your data and develop approaches, applications and processes that work for you. The hardware is here to smooth out your analytics journey; it's far too soon to stop now.