From edge to cloud – and everywhere in between

How we use IT infrastructure has changed. How we manage it needs to change too


It’s often said that data is the new oil, to the extent it’s become something of a cliché in the tech industry. Perhaps another, less tired comparison is that data is like work in Parkinson’s Law – it will expand not to fill the time available for its completion, but into every element of infrastructure and activity that an organisation has available.

This is increasingly true: the big data revolution made all data interesting and valuable to a business, not just the information it was accustomed to examining. By analysing these vast data sets, organisations can uncover new value streams and increase both productivity and profits. With this in mind, it’s important to consider where new sources of data may come from, and the infrastructure that underpins their collection, analysis and storage.

Data, data everywhere

Nowadays, technology moves rapidly. Even millennials and older members of Generation Z have experienced this incessant churn, with things that once seemed core to computing, such as floppy disks, becoming obsolete within their lifetimes.

But it’s worth remembering that the pace of change between the 1950s and the early 1990s was much slower. Until well into the 2000s, the data centre and the mainframe remained the core of companies’ IT infrastructure. Almost without exception, these behemoths comprised hardware that was fully owned by the business and located on premises.

By the turn of the millennium, colocation providers had started to spring up, offering SMBs and enterprises the opportunity to house their hardware in a third-party facility – and, under managed hosting arrangements, to lease the hardware itself and leave software and infrastructure management to the provider. Over the following decade, this evolved into what we now know as public cloud computing.

It’s easy to think of this evolution as linear: just as tapes were replaced by floppies, floppies by CDs, and CDs in turn by flash drives and direct downloads, so the humble data centre has been replaced by the public cloud.

That is not the case, however.

Many businesses, particularly medium-sized ones and larger, still have an on-premises data centre. Some still use colocation. Most, if not all, organisations use public cloud services of one kind or another, be that infrastructure as a service (IaaS), such as that offered by Amazon Web Services (AWS) and Microsoft Azure, or software as a service (SaaS) products like Workday or Box.

What has emerged is a melange of technologies – most recently joined by edge computing – that organisations are using to fulfil different business needs. But this expansion has been carried out in a largely unplanned way, which can make it difficult to manage.

More infrastructure, more problems

While concerns over ‘shadow IT’ in the early cloud years may have been overplayed, there’s no denying that modern IT infrastructure is far more difficult to manage than it was even 20 years ago.

Cloud sprawl, the addition of new technologies and services like edge and multi-cloud, and the rise of the distributed workforce have all made infrastructure trickier to manage. What’s more, we can’t expect the pace of change to let up – with increased mobility, the still-growing Internet of Things, and technologies that haven’t even been invented yet, this labyrinth will only become harder to navigate and manage.

Yet IT departments have little alternative but to accept that all these different elements are needed. If there’s an edge computing instance in their infrastructure, they will know of its existence and its importance: for that particular data source, analysis and feedback have to happen with minimal latency, and nothing but edge computing delivers that. If the development or data science teams are using several AWS instances as well as on-premises infrastructure, it’s because they need them for a given project.

In short, not all data is created equal or performs the same functions. That doesn’t mean IT departments are doomed to look after an ever more complex infrastructure that doesn’t always play nicely together and is both time-consuming and costly to manage, though.

Managing a modern set-up

Some vendors have recognised the problems IT departments face in confronting this growing tech sprawl and have moved to resolve them, most often through computing on demand.

Computing on demand helps IT departments manage their companies’ use of infrastructure across every setting: public cloud, on-premises data centres, colocation facilities, branch offices and the edge.

One of the first movers in this area is Hewlett Packard Enterprise (HPE), with its comprehensive GreenLake offering.

GreenLake promises not only better control over these kinds of mixed environments, but also a more cost-effective approach. Designing and managing on-premises infrastructure, including data centres, edge environments and branch offices, has long come with concerns over adequate provisioning: how do you ensure you’ll have enough capacity for what the business will be doing in five or ten years’ time? That worry has often led to overprovisioning.

GreenLake, however, allows IT departments to spin up new compute instances, extend memory and more in a cloud-like way, all while keeping data and processes on premises. Thanks to consumption-based pricing, the capacity is there should you need it, but you only pay for what you actually use, which removes the problems associated with overprovisioning.
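To see why that matters, here’s a rough sketch of the arithmetic. The Python below compares buying hardware up front for a projected five-year peak against paying per unit of capacity consumed each year. Every figure is hypothetical, invented purely for illustration, and bears no relation to HPE’s actual GreenLake pricing.

# All figures are hypothetical and for illustration only – not HPE GreenLake's real pricing.

def overprovisioned_cost(peak_units, unit_capex, years, annual_opex_rate=0.15):
    # Buy enough hardware for the projected peak up front (capex), then pay
    # to power, cool and maintain it every year, whether it's used or not (opex).
    capex = peak_units * unit_capex
    opex = capex * annual_opex_rate * years
    return capex + opex

def consumption_cost(units_used_per_year, price_per_unit_year):
    # Pay only for the capacity actually consumed in each year.
    return sum(units * price_per_unit_year for units in units_used_per_year)

# Usage ramps gradually towards a 100-unit peak over five years.
usage = [40, 55, 70, 85, 100]

print(f"Overprovisioned, owned hardware: £{overprovisioned_cost(100, 1_000, 5):,.0f}")
print(f"Consumption-based: £{consumption_cost(usage, 400):,.0f}")

With these invented numbers, the owned kit costs £175,000 over five years while the consumption-based model comes to £140,000. Even though the per-unit price is higher (£400 per unit-year against an effective £350 for owned hardware), paying only for the 350 unit-years actually consumed beats paying for all 500, and the unused headroom is still there when the peak arrives.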

How we store and use data, and the technology we use to do that, will continue to evolve. Who would have thought, 10 years ago, that the cloud would play the role in business and in life that it does now? And that’s to say nothing of the growth of edge computing. As these new ideas continue to spring up and be added to organisations’ existing IT strategies, only truly adaptive solutions like GreenLake can help IT departments manage their infrastructure from the data centre, to the edge, to the cloud, and beyond.

Learn more about HPE GreenLake
