
From edge to cloud – and everywhere in between

How we use IT infrastructure has changed. How we manage it needs to change too


It’s often said that data is the new oil, to the extent it’s become something of a cliché in the tech industry. Perhaps another, less tired comparison is that data is like work in Parkinson’s Law – it will expand not to fill the time available for its completion, but into every element of infrastructure and activity that an organisation has available.

This has only become more true since the Big Data revolution made all data interesting and valuable to a business, not just the information it was accustomed to examining. By analysing these vast data sets, organisations can uncover new value streams and increase both productivity and profits. With this in mind, it’s important to consider where new sources of data may come from and the infrastructure that underpins their collection, analysis and storage.

Data, data everywhere

Nowadays, technology moves rapidly. Even millennials and older members of Generation Z have experienced this incessant churn, with things that once seemed core to computing, such as floppy disks, becoming obsolete within their lifetimes.

But it’s worth remembering that the pace of change between the 1950s and the early 1990s was much slower. Right up until the 2000s, the data centre and the mainframe were the core of companies’ IT infrastructure. Almost without exception, these behemoths comprised hardware that was fully owned by the business and located on premises.

By the turn of the millennium, colocation providers had started to spring up, offering SMBs and enterprises the opportunity to house their hardware in a third-party facility, or lease it outright, with software and infrastructure management often handled for them. Over the following decade, this evolved into what we now know as public cloud computing.

It’s easy to think of this evolution as linear: just as tapes were replaced by floppies, floppies by CDs, and CDs in turn by flash drives and direct downloads, so the humble data centre has been replaced by the public cloud.

That is not the case, however.

Many businesses, particularly those of medium size and above, still have an on-premises data centre. Some still use colocation. Most, if not all, organisations use public cloud services of one kind or another, be that infrastructure as a service, such as that offered by Amazon Web Services (AWS) and Microsoft Azure, or software as a service products like Workday or Box.

What has emerged is a melange of technologies – most recently joined by edge computing – that organisations are using to fulfil different business needs. But this expansion has been carried out in a largely unplanned way, which can make it difficult to manage.

More infrastructure, more problems

While concerns over ‘shadow IT’ in the early cloud years may have been overplayed, there’s no denying that modern IT infrastructure is far more difficult to manage than it was even 20 years ago.

Cloud sprawl, the addition of new technologies and services like edge and multi-cloud, and the rise of the distributed workforce have all made things trickier to manage. What’s more, we can’t expect the pace of change to let up: with increased mobility, the still-growing Internet of Things, and technologies that haven’t even been invented yet, this labyrinth will only grow harder to navigate and manage.

Yet IT departments have little alternative but to accept that all these different elements are needed. If there’s an edge computing instance in their infrastructure, they will know of its existence and its importance: for that particular data source, analysis and feedback have to happen with minimal latency, and nothing but edge computing can deliver that. If the development or data science teams are using several AWS instances as well as on-premises infrastructure, it’s because they need them for a given project.

In short, not all data is created equal or performs the same functions. This doesn’t mean, though, that IT departments are doomed to look after an increasingly complex infrastructure that doesn’t always play nicely together and can be both time-consuming and costly to manage.

Managing a modern set-up

Some vendors have seen the problems IT departments face in confronting this growing tech sprawl and have moved to resolve them, most often through computing on demand.

Computing on demand helps IT departments manage their companies’ use of infrastructure across all settings, from public cloud to on-premises data centres, to colocation, to branch offices and the edge.

One of the first movers in this area is Hewlett Packard Enterprise (HPE), with its comprehensive GreenLake offering.

GreenLake promises not only better control over these mixed environments, but also a more cost-effective solution. Designing and managing on-premises infrastructure, including data centres, edge environments, and branch offices, has long come with concerns over adequate provisioning: how do you ensure you will have enough capacity for what the business will be doing in five or ten years’ time? The answer has often been overprovisioning.

GreenLake, however, allows IT departments to spin up new compute instances, extend memory, and more in a cloud-like way, all while keeping data and processes on premises. Thanks to consumption-based pricing, the capacity is there should you need it, but you don’t actually pay for it unless you use it, which removes the problems associated with overprovisioning.
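To see why that matters, consider a minimal back-of-the-envelope sketch in Python. Every figure here (the server counts, the per-server monthly rate, the linear growth curve) is a hypothetical assumption chosen purely for illustration, not HPE GreenLake pricing:

    # Illustrative sketch only: all figures are hypothetical assumptions,
    # not real HPE GreenLake pricing.

    MONTHS = 60   # a five-year planning horizon
    RATE = 120    # hypothetical cost per server per month
    BASE = 40     # servers needed today
    PEAK = 100    # capacity provisioned for projected peak demand

    # Assume actual usage grows linearly from 40 to 100 servers over five years.
    usage = [BASE + (PEAK - BASE) * m / (MONTHS - 1) for m in range(MONTHS)]

    # Overprovisioning: pay for peak capacity every month, used or not.
    overprovisioned = PEAK * RATE * MONTHS

    # Consumption-based: pay only for the servers actually used each month.
    consumption_based = sum(u * RATE for u in usage)

    print(f"Fixed peak capacity:  ${overprovisioned:,.0f}")   # $720,000
    print(f"Consumption-based:    ${consumption_based:,.0f}")  # $504,000

In this toy model, the consumption-based approach works out roughly 30% cheaper over the five years, and the entire gap is the cost of capacity that would otherwise have sat idle while demand ramped up.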

How we store and use data, and the technology we use to do that, will continue to evolve. Who would have thought, 10 years ago, that the cloud would play the role in business and in life that it does now? And that’s to say nothing of the growth of edge computing. As these new ideas continue to spring up and be added to organisations’ existing IT strategies, only truly adaptive solutions like GreenLake can help IT departments manage their infrastructure from the data centre, to the edge, to the cloud, and beyond.

Learn more about HPE GreenLake
