The key challenges of migrating databases to the cloud

With the pandemic forcing everyone to rethink their business operations, more and more companies are moving from on-premise storage to cloud-based environments. What's more, this push is accelerating the adoption of newer technologies such as AI and machine learning.

However, while big data and cloud-based architectures can help to scale a business and improve operational efficiency, some key issues need to be monitored during the migration process.

What is cloud migration?

Simply put, it is the process of moving digital assets, services and IT resources - either entirely or partially - into the cloud. A migration can also consist of moving from one cloud environment to another.

One of the main reasons to migrate is digital transformation; moving on from outdated legacy infrastructure, such as ageing servers or unreliable appliances. However, there are also reasons to take a hybrid approach, with a mix of cloud and on-prem working together.

It's worth noting, however, that digital transformation is a journey, not a destination, and businesses should consider what they will use the cloud for and where they plan to take their business with it. Once in the cloud, the possibilities are seemingly endless.

The benefits of migrating to the cloud

Moving to a cloud environment can open up several operational possibilities while also reducing costs, such as ownership and maintenance fees for on-prem infrastructure. There are also potential benefits to operational speed, agility and innovation, which can in turn lead to reduced costs.

Remote working is also better managed with cloud infrastructure, as companies have needed to become more elastic with their IT capabilities. Deploying software across a distributed workforce is far easier with a cloud-based operation. Other benefits include increased agility, simplified IT management and consistent availability of company resources.

Can a database be moved to the cloud in isolation?

Unlike a collaboration system, which is a natural candidate to move in isolation from other IT systems, a database is by its very design connected to a variety of other systems and services that make up a multi-tiered architecture.

This architecture can then be queried and updated by users belonging to different groups, such as employees, partners and customers, and also mined by analysts and researchers seeking to identify future trends.

"It is vitally important to remember, in order for the database to be migrated successfully, that all the associated tiers and systems are known, their function clearly understood, and their performance benchmarked," says Paul Griffiths, senior director of Advanced Technology Group at Riverbed.

"Only with a complete view of all the components and their interactions with each other, can it be decided which elements need to be migrated along with the database itself."

Moving a database

Migrating a database to the cloud can seem like a daunting task, and in some cases, it's the initial challenge of figuring out what all the "moving parts" are that can cause migration projects to fall at the first hurdle.

"This can lead to organisations never being able to fully exploit the benefits of the cloud. While that may not bring about the demise of a company, it could affect how competitive they are able to be in the market," says Griffiths.

Schrock says that the first challenge is realising that database migration is not a one-time event. Moving a database to the cloud only makes sense if the application is migrated too, which is cumbersome and can take anywhere from days to months. During this time, teams must ensure that their applications will run in the cloud, that they can develop effectively within it, and that disruption is minimised during the final transition.

"All of this requires high-quality data in the cloud for continuous testing across the application, cutover process, and SDLC integration," says Schrock. "Failure to drive quality testing will at best slow down the project, and at worst cause significant disruption, poor quality post-transition, and an inability to move quickly to address problems."

Shaw says that security concerns are often another huge barrier when it comes to migrating databases to the cloud.

"Data has become the most valuable and important asset for all organisations and, therefore, effective protection is paramount for the success and future growth of the business as a whole," says Shaw. "If your business's data protection is not adequate and you suffer a breach then your reputation, brand and, by extension, the entire business is at risk. Hence the reluctance of some businesses to adopt new strategies and embrace cloud."

Move it, then monitor it

When a database has been migrated, it's important to use tools like SQL Server Query Store to monitor, evaluate and understand the baseline data - for example, transactional throughput and surges in daily, weekly or monthly activity.

"These metrics will help organisations to determine whether to move from standalone PaaS databases to PaaS Elastic database pools. Lastly, it makes good business sense to use the data to show the TCO reduction to key stakeholders such as the CTO," says Alan McAlpine, senior consultant of Enterprise Data at IT services firm ECS.

Leaving databases on-premise

Not all databases should go to the cloud, explains Roberto Mircoli, EMEA CTO of Virtustream. "Once you have identified a specialised enterprise-class cloud provider which guarantees the level of security, compliance, performance and availability required by your business, then what should be left on-premise is really only what's constrained by residual latency limitations or extremely stringent regulatory requirements."

"So for example, some particular applications in the automotive industry leverage manufacturing integration and intelligence to collect data in real-time from the automation systems to plan the procurement of components and materials; in these particular cases, 20, 15 or even 10ms of latency are not acceptable," he says.

This article was first published on 20/03/2018 and has since been updated

Rene Millman

Rene Millman is a freelance writer and broadcaster who covers cybersecurity, AI, IoT, and the cloud. He also works as a contributing analyst at GigaOm and has previously worked as an analyst for Gartner covering the infrastructure market. He has made numerous television appearances to give his views and expertise on technology trends and companies that affect and shape our lives. You can follow Rene Millman on Twitter.