How to improve database costs, performance and value
We look at some top tips to get more out of your databases
Databases are among the most foundational building blocks of enterprise IT, but running them effectively can be surprisingly challenging. Far too often, poorly optimised databases end up eating into valuable time, generating unnecessary costs and holding IT teams back.
There are, however, a number of tips and tricks that database administrators (DBAs) can utilise to speed up their database management, making it more efficient and delivering improved value. We’ve listed some of our top tips below, but for even more insight, register now for our live webinar on getting the most out of Toad for Oracle, presented in association with Quest Software. Registration is free, and the webinar will be streaming live on 7 October from 3PM BST/10am EST/7am PST.
Automate management tasks
There’s a huge range of tasks that DBAs typically need to perform, covering everything from storage management and configuration to backup and recovery jobs. Many of these tasks are mundane and regularly repeated, however, which makes them perfect candidates for automation. Instead of spending valuable time doing the same simple jobs over and over, DBAs can automate them, freeing themselves up for more complex and rewarding work.
There are numerous ways to automate database management tasks; many DBAs write their own custom tools crafted to their unique needs, while some database management software vendors build automation tools directly into their products. However you choose to do it, automating elements of your database management not only saves time and money, it can also help reduce recovery time in the event of a failure.
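As a flavour of the custom-tool approach, here is a minimal sketch of an automated backup job in Python, using SQLite's online backup API purely for illustration; the function name, paths and timestamp format are our own assumptions, not something from a specific vendor's product:

```python
import sqlite3
from datetime import datetime
from pathlib import Path

def backup_database(source_path: str, backup_dir: str) -> Path:
    """Copy a live SQLite database to a timestamped backup file."""
    Path(backup_dir).mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_dir) / f"backup-{stamp}.db"
    src = sqlite3.connect(source_path)
    dst = sqlite3.connect(dest)
    try:
        # SQLite's online backup API: safe to run while the database is in use
        src.backup(dst)
    finally:
        src.close()
        dst.close()
    return dest
```

In practice, a script like this would be triggered on a schedule (via cron, Windows Task Scheduler or a job runner) rather than by hand, which is where the time savings come from.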
Deploy containers and virtual machines
Virtualisation and multi-tenancy are nothing new in the worlds of enterprise IT and software development, but they can have big advantages for database management. If you’re running your databases on single-tenant servers, then porting them to VMs could bring big savings. For starters, this allows you to host multiple databases on the same infrastructure, which can cut down on licensing and hardware investment costs. It also offers more flexibility in porting, replicating and modifying databases, although some may find vertical scaling becomes challenging.
If you want even more portability, you may wish to consider containerising your databases. This works well as part of a DevOps environment, and integrates nicely with both CI/CD pipelines and multi-cloud architectures. The lightweight nature of containers means you can run large amounts on a single server, and they also lend themselves to high levels of automation which (as we covered above) can have big benefits for cost and efficiency.
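To illustrate how several containerised databases can share one server, here is a minimal Docker Compose sketch; the service names, image tag and credentials are hypothetical placeholders, not recommendations from the article:

```yaml
# docker-compose.yml -- illustrative two-database setup on one host
services:
  reporting-db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example   # use a secrets manager in real deployments
    volumes:
      - reporting-data:/var/lib/postgresql/data
  sessions-db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
    volumes:
      - sessions-data:/var/lib/postgresql/data
volumes:
  reporting-data:
  sessions-data:
```

Because each database lives in its own container with its own named volume, either one can be replicated, moved or torn down without touching the other.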
Optimise SQL queries
Not all database queries are created equal. Inefficient or poorly structured SQL queries often take longer to run, and when executed against live production databases they can introduce errors and cause outages for other users. By putting in a little extra legwork to fine-tune queries up front, database operators can speed up day-to-day operations and improve the long-term health and performance of their databases.
For example, when selecting fields as part of a query, it can be tempting to use SELECT * to quickly pull back every column, or SELECT DISTINCT to identify unique results. On large databases, however, this can lead to unnecessary processing. Instead, think about the results you’re looking for and structure your query around those specific fields; this cuts down on the processing power required to run it. Similarly, using explicit INNER JOINs instead of Cartesian joins created with WHERE clauses can massively reduce the amount of work being done by the system.
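Both habits can be seen in a small, self-contained example; the tables and data here are invented for illustration, using SQLite as a stand-in for whichever database you run:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 99.0), (12, 2, 40.0);
""")

# Name only the field you need, rather than SELECT *
rows = conn.execute("SELECT name FROM customers WHERE id = ?", (1,)).fetchall()

# An explicit INNER JOIN instead of a Cartesian product filtered by WHERE
order_totals = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers AS c
    INNER JOIN orders AS o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
```

The JOIN condition lets the query planner match rows directly, instead of first building every customer/order combination and then discarding most of it.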
Manage database growth
One of the biggest factors affecting both the performance and speed of a database is its size. The bigger it is, the longer it takes to search through and deliver results from a query. Size is also a factor in processing and transaction costs for cloud-hosted databases, and in hardware upgrade cadences for on-premises systems.
The best way to ensure your database doesn’t expand at an unsustainable rate is to set up your schema and validation rules so that it contains only the data necessary for its operation. Storing large amounts of infrequently or partially used records increases costs and the time it takes to run queries.
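Validation rules of this kind can often live in the schema itself. Below is a minimal sketch using SQL constraints (shown via SQLite); the table, column names and value range are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# NOT NULL and CHECK constraints reject data the application never needs,
# so junk rows are stopped at the door rather than cleaned up later
conn.execute("""
    CREATE TABLE readings (
        sensor_id INTEGER NOT NULL,
        value     REAL    NOT NULL CHECK (value BETWEEN -50 AND 150),
        taken_at  TEXT    NOT NULL
    )
""")
conn.execute("INSERT INTO readings VALUES (1, 21.5, '2024-01-01T09:00')")

rejected = False
try:
    # An out-of-range value violates the CHECK constraint and is never stored
    conn.execute("INSERT INTO readings VALUES (1, 999.0, '2024-01-01T09:05')")
except sqlite3.IntegrityError:
    rejected = True
```

The same idea applies in any engine that supports constraints: the database grows only with data that passed your rules.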
Retire old or unnecessary data
As we’ve just covered, keeping your database streamlined is a key part of making sure it remains efficient and cost-effective. However, even if you restrict inputs to the bare minimum of fields, the size of your database will inevitably swell over time. If it’s getting a bit unwieldy, it may be worth seeing whether you can streamline it by removing old entries that no longer need to be there.
The specifics of what records can safely be deleted, and when, will depend on the purpose of your database, but setting clear data deletion policies can help keep its size at a manageable level. An added bonus is that your queries won’t return as many irrelevant junk results, and it may even help with GDPR compliance in the case of databases containing personal information. Large tables can also be partitioned to help stop them getting too big, and indexes should be monitored to ensure they’re still being used; any that aren’t can be removed to free up additional space.
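A deletion policy like this often boils down to one scheduled statement. Here is a minimal sketch; the table, the 365-day retention period and the date format are illustrative assumptions, not a recommendation for any particular dataset:

```python
import sqlite3
from datetime import date, timedelta

RETENTION_DAYS = 365  # hypothetical policy; pick a period that fits your data

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit_log (event TEXT, logged_on TEXT)")
conn.executemany(
    "INSERT INTO audit_log VALUES (?, ?)",
    [("login", "2020-01-15"), ("login", date.today().isoformat())],
)

# Delete everything older than the retention cutoff in one pass
cutoff = (date.today() - timedelta(days=RETENTION_DAYS)).isoformat()
deleted = conn.execute(
    "DELETE FROM audit_log WHERE logged_on < ?", (cutoff,)
).rowcount
conn.commit()
```

Run on a schedule, a statement like this keeps table growth bounded without any manual pruning.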
Time your queries
The beauty of databases is that many of them are multi-purpose, and can be used by a variety of different applications and users for different things. However, it’s important to consider the impact that queries can have on performance for the rest of the organisation. For example, if you’re running large, complex queries that require a lot of processing power, this may cause other queries and applications to run slower.
If you’re planning to run any particularly heavyweight queries, consider how time-sensitive they are; if they’re not especially urgent, it might be worth scheduling them to run outside of peak business hours when there’s likely to be less activity on the database. This will ensure that disruption is kept to a minimum, and reduce the impact on the database’s efficiency.
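The scheduling decision itself can be a simple time-window check. Below is a minimal sketch; the 8am-to-6pm peak window is a hypothetical example, not something from the article:

```python
from datetime import datetime, time

# Hypothetical business-hours window; adjust for your organisation
PEAK_START, PEAK_END = time(8, 0), time(18, 0)

def should_defer(now: datetime) -> bool:
    """Return True if a heavyweight query should wait for off-peak hours."""
    return PEAK_START <= now.time() < PEAK_END
```

A job runner can call this before launching a big report: a query kicked off mid-afternoon gets queued for the evening, while one submitted at midnight runs immediately.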
Perform regular audits
All of the tips we’ve mentioned above are good starting points, but databases are like a house: over time, if you don’t keep an eye on them, it’s easy for clutter to build up. To keep your database in peak condition, it’s essential to schedule regular audits of things like the schema, input rules, data deletion policies and usage patterns, to ensure they remain in line with the business needs the database is serving.
Making time for tasks like this may not seem like a priority when there are so many other things to be done, but regular housekeeping of your database can pay dividends in the long term, with faster performance, operational cost savings and fewer errors.
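As a starting point for that housekeeping, a small script can pull together per-table row counts and the indexes currently defined. This sketch reads SQLite's catalogue table for illustration; other engines expose the same information through their own system views, and the function name is our own:

```python
import sqlite3

def audit_summary(conn: sqlite3.Connection) -> dict:
    """A lightweight audit: row counts per table plus the indexes defined."""
    summary = {"tables": {}, "indexes": []}
    # sqlite_master is SQLite's built-in catalogue of schema objects
    for (name,) in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ):
        count = conn.execute(f"SELECT COUNT(*) FROM {name}").fetchone()[0]
        summary["tables"][name] = count
    summary["indexes"] = [
        name for (name,) in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'index'"
        )
    ]
    return summary
```

Comparing snapshots like this month on month makes unexpected table growth, and indexes nobody remembers creating, easy to spot.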