What is cache memory?

We explain the different categories of cache memory and how it differs from RAM


Just like humans, computer systems use several types of memory that work together to keep them running smoothly. Some act as short-term memory for quick, routine tasks, while others handle longer, more data-heavy functions, but all are vital to the overall operation of a computer's software and hardware.

The term 'memory' is often used simply to describe information storage, but some memory components have uses beyond that remit, such as encoding and data retrieval, which are core parts of how cache memory works. On its own, cache memory is of little use; it plays its important role by interacting with other parts of a computer system.

It allows the processor to keep recently accessed data close at hand so it can be reused, rather than being fetched from slower main memory every time it's needed. This is why systems with a larger cache often run quicker: they can keep more of that frequently used data within easy reach.
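As a rough illustration of that effect, the C sketch below sums the same number of values twice: once by repeatedly walking a small buffer that should stay resident in cache, and once by walking a buffer far larger than any cache. On most machines the cache-resident version is noticeably faster, though exact results depend on hardware, compiler and optimisation settings, and the buffer sizes here are assumptions chosen purely for illustration.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Illustrative sketch, not a rigorous benchmark: compare reading the same
 * number of elements from a cache-resident buffer vs a much larger one. */
static long long sum_repeatedly(const int *buf, size_t len, size_t passes)
{
    long long total = 0;
    for (size_t p = 0; p < passes; p++)
        for (size_t i = 0; i < len; i++)
            total += buf[i];
    return total;
}

int main(void)
{
    size_t small_len = 32 * 1024 / sizeof(int);        /* ~32KB: fits in a typical L1 cache */
    size_t large_len = 64 * 1024 * 1024 / sizeof(int); /* ~64MB: far larger than most caches */
    int *small_buf = malloc(small_len * sizeof(int));
    int *large_buf = malloc(large_len * sizeof(int));
    if (!small_buf || !large_buf)
        return 1;

    for (size_t i = 0; i < small_len; i++) small_buf[i] = (int)(i & 0xff);
    for (size_t i = 0; i < large_len; i++) large_buf[i] = (int)(i & 0xff);

    clock_t t0 = clock();
    long long s1 = sum_repeatedly(small_buf, small_len, large_len / small_len); /* same total reads */
    clock_t t1 = clock();
    long long s2 = sum_repeatedly(large_buf, large_len, 1);
    clock_t t2 = clock();

    printf("cache-resident passes: %.3fs (sum %lld)\n", (double)(t1 - t0) / CLOCKS_PER_SEC, s1);
    printf("single pass over large buffer: %.3fs (sum %lld)\n", (double)(t2 - t1) / CLOCKS_PER_SEC, s2);

    free(small_buf);
    free(large_buf);
    return 0;
}
```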

Cache memory vs RAM

In a technical sense, cache memory and random-access memory (RAM) sound like they perform similar functions, but there are some notable differences. Cache memory holds data the processor is likely to need again, so those operations can start straight away, whereas RAM stores the application and operational data that is currently in use.

Cache memory is also faster, as it sits closer to the central processing unit (CPU) than RAM does, and it's generally much smaller than RAM because it only needs to hold the data the CPU is likely to rely on for upcoming operations.

Cache memory types

Cache memory can be complicated, however; not only is it different from the standard DRAM that most people are familiar with, but it also comes in several forms.

Cache memory generally operates in one of three configurations: direct mapping, fully associative mapping and set associative mapping.

Direct mapping assigns each block of memory to a specific location within the cache, while fully associative mapping lets any cache location hold a block, rather than the location being pre-set. Set associative mapping acts as a halfway house between the two: each block is mapped to a small subset of locations within the cache.
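As a rough illustration of the difference, the sketch below works out where a given address could be placed under each scheme for a hypothetical cache with 64-byte lines, 256 lines in total and 4-way set associativity; all of those parameters are assumptions chosen for the example rather than figures from any particular processor.

```c
#include <stdio.h>

/* Hypothetical cache parameters, chosen only for illustration. */
#define LINE_SIZE 64
#define NUM_LINES 256
#define NUM_WAYS  4                         /* associativity for the set-associative case */
#define NUM_SETS  (NUM_LINES / NUM_WAYS)

int main(void)
{
    unsigned int address = 0x12345678u;

    /* Every scheme first works out which memory block the address belongs to. */
    unsigned int block = address / LINE_SIZE;

    /* Direct mapping: the block can only ever live in one fixed line. */
    unsigned int direct_line = block % NUM_LINES;

    /* Set associative mapping: the block maps to one set,
     * but may occupy any of the NUM_WAYS lines within that set. */
    unsigned int set = block % NUM_SETS;

    /* Fully associative mapping: any of the NUM_LINES lines may be used,
     * so there is no index calculation at all - only a tag comparison. */

    printf("block %u -> direct-mapped line %u, set-associative set %u (any of %d ways)\n",
           block, direct_line, set, NUM_WAYS);
    return 0;
}
```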

Cache memory grading

Cache memory is also graded into three levels: L1, L2 and L3. L1 cache is generally built into the processor chip and is the smallest, typically ranging from 8KB to 64KB. However, it's also the fastest type of memory for the CPU to read. Multi-core CPUs generally have a separate L1 cache for each core.

L2 and L3 caches are larger than L1 but take longer to access. In modern processors, L2 is usually built into the CPU as well, typically one per core, while L3 is generally shared across all cores; in older designs, these caches sometimes sat on a separate chip between the CPU and the RAM.
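If you're curious how these levels look on your own machine, Linux exposes each core's cache hierarchy through sysfs. The sketch below is a minimal example that assumes the usual /sys/devices/system/cpu/cpu0/cache/indexN/ layout; which entries exist varies by kernel and hardware, so treat it as illustrative.

```c
#include <stdio.h>

/* Minimal sketch: print the level and size of each cache reported for core 0.
 * The indexN entries typically cover the L1 data cache, L1 instruction cache,
 * L2 and L3, but availability depends on the kernel and the hardware. */
int main(void)
{
    for (int i = 0; i < 8; i++) {
        char path[128], level[16] = "?", size[16] = "?";
        FILE *f;

        snprintf(path, sizeof path, "/sys/devices/system/cpu/cpu0/cache/index%d/level", i);
        if (!(f = fopen(path, "r")))
            break;                  /* no more cache entries reported for this core */
        fscanf(f, "%15s", level);
        fclose(f);

        snprintf(path, sizeof path, "/sys/devices/system/cpu/cpu0/cache/index%d/size", i);
        if ((f = fopen(path, "r"))) {
            fscanf(f, "%15s", size);
            fclose(f);
        }

        printf("index%d: L%s cache, %s\n", i, level, size);
    }
    return 0;
}
```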

Graphics processing units (GPUs) often have their own cache memory, separate from the CPU's, which ensures the GPU can still complete complex rendering operations quickly without relying on relatively high-latency system RAM.
