
Module-5

Basic Architecture of Cache Memory: A Simple Explanation

Cache memory helps your computer quickly access data. Think of it as a small, super-fast storage area that sits
between the processor (CPU) and the main memory (RAM). Its architecture is made up of three main parts:

1. Directory Store (Cache-Tag)

•	What it does: It acts like a “map” that keeps track of where each piece of cached data originally came from in main memory.

•	How it works:

o	When the processor wants some data, it looks at this map to check whether the data is already in the cache.

o	If the address recorded on the map matches the one the processor wants, it’s called a cache hit, and the data is delivered quickly.

o	If it doesn’t match, it’s a cache miss, and the processor fetches the data from main memory.

•	Structure: This map is simply a list of address tags, one entry per cache line (the address check is sketched below).
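
To make the map idea concrete, here is a minimal C sketch (not part of the original notes) of how a direct-mapped cache could split a 32-bit address and compare tags. The sizes (256 lines of 16 bytes) are chosen to match the 4 KB example later in this module, and all names are invented for illustration.

#include <stdint.h>
#include <stdbool.h>

#define LINE_COUNT 256          /* number of cache lines (assumed) */

/* For 16-byte lines and 256 lines, a 32-bit address splits into:
 *   bits  0-3 : byte offset inside the line  (16 bytes  -> 4 bits)
 *   bits  4-11: line index into the cache    (256 lines -> 8 bits)
 *   bits 12-31: tag, stored in the directory (the "map")            */
uint32_t addr_offset(uint32_t addr) { return addr & 0xFu; }
uint32_t addr_index(uint32_t addr)  { return (addr >> 4) & 0xFFu; }
uint32_t addr_tag(uint32_t addr)    { return addr >> 12; }

/* Directory store: one tag remembered per cache line. */
static uint32_t directory[LINE_COUNT];

/* Hit if the remembered tag matches the tag of the requested address.
 * (A real lookup also checks the valid bit, introduced in part 3.)   */
bool is_hit(uint32_t addr)
{
    return directory[addr_index(addr)] == addr_tag(addr);
}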

2. Data Section

•	What it does: This is where the actual data is stored in the cache.

•	How it works:

o	Data is stored in chunks called cache lines. Each cache line holds several pieces of data (e.g., 4 small blocks of 32 bits each).

o	The stated size of a cache counts only this data section; the map (cache-tags) and status bits are kept separately and are not included in that figure.

•	Example: If your cache has 256 lines and each line holds 4 data blocks, the cache can store 256 × 4 = 1024 blocks of data (a layout sketch follows below).
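
A rough sketch of what one line of the data section might look like in C, using the hypothetical sizes from the example above (4 blocks of 32 bits per line, 256 lines); the type and field names are invented for illustration.

#include <stdint.h>

#define LINE_COUNT      256
#define BLOCKS_PER_LINE 4

/* One cache line of data: 4 x 32-bit blocks = 128 bits = 16 bytes. */
typedef struct {
    uint32_t block[BLOCKS_PER_LINE];
} line_data_t;

/* The data section itself: 256 lines x 4 blocks = 1024 blocks of storage.
 * Only this part counts toward the quoted cache size; the map and the
 * status bits are kept separately.                                      */
static line_data_t data_section[LINE_COUNT];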

3. Status Information

•	What it does: Tracks whether the stored data is valid or has been changed.

•	How it works:

o	Valid Bit: Shows whether the line actually holds usable, up-to-date data.

o	Dirty Bit: Shows whether the data has been modified. If it has, the cache must write the data back to main memory before discarding or replacing the line.
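
The cache-tag and the status bits can be pictured as a small bookkeeping record kept alongside each line's data. The struct below is only a sketch with invented field names, not an actual hardware description.

#include <stdint.h>
#include <stdbool.h>

/* Per-line bookkeeping: the cache-tag plus the two status bits. */
typedef struct {
    uint32_t tag;    /* which main-memory block this line currently holds  */
    bool     valid;  /* the line holds usable data                         */
    bool     dirty;  /* the data was modified and must be written back to
                        main memory before the line is replaced            */
} line_info_t;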

How Cache Memory Works

•	Cache Controller: This is like a manager that decides how to handle data requests.

o	When the processor asks for data, the controller checks the directory store for a match.

o	If there’s a match (cache hit), it delivers the data from the cache.

o	If not (cache miss), it fetches the data from main memory, stores it in the cache, and then gives it to the processor (this flow is sketched below).
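
Putting the pieces together, here is a hedged sketch of what the controller's read path could look like, reusing the hypothetical LINE_COUNT, line_data_t and line_info_t definitions from the sketches above. memory_read_line and memory_write_line are stand-ins for the slower main-memory transfers, not real bus operations.

#include <stdint.h>
#include <stdbool.h>

#define LINE_COUNT 256

typedef struct { uint32_t block[4]; } line_data_t;               /* data section  */
typedef struct { uint32_t tag; bool valid, dirty; } line_info_t; /* tag + status  */

static line_data_t data_section[LINE_COUNT];
static line_info_t directory[LINE_COUNT];

/* Stand-ins for the slow main-memory transfers (bus read / bus write). */
static void memory_read_line(uint32_t addr, line_data_t *dst)        { (void)addr; (void)dst; }
static void memory_write_line(uint32_t addr, const line_data_t *src) { (void)addr; (void)src; }

/* Address fields for 256 lines of 16 bytes (see the earlier sketch). */
static uint32_t line_index(uint32_t addr) { return (addr >> 4) & 0xFFu; }
static uint32_t line_tag(uint32_t addr)   { return addr >> 12; }

/* Controller behaviour for a read: deliver from the cache on a hit;
 * on a miss, fetch the line from main memory, store it, then deliver it. */
uint32_t cache_read_word(uint32_t addr)
{
    uint32_t i = line_index(addr);

    if (!(directory[i].valid && directory[i].tag == line_tag(addr))) {
        /* Cache miss. If the old line was modified, write it back first. */
        if (directory[i].valid && directory[i].dirty)
            memory_write_line((directory[i].tag << 12) | (i << 4),
                              &data_section[i]);

        /* Fetch the requested line and remember where it came from. */
        memory_read_line(addr & ~0xFu, &data_section[i]);
        directory[i].tag   = line_tag(addr);
        directory[i].valid = true;
        directory[i].dirty = false;
    }

    /* Hit (or freshly filled line): hand back the requested 32-bit block. */
    return data_section[i].block[(addr >> 2) & 0x3u];
}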

Example: A 4 KB Cache

Imagine a 4 KB cache with the following setup:

1. 256 cache lines.

2. Each cache line holds 4 blocks of data, with each block being 32 bits (4 bytes).

o So, one cache line can hold 4 × 32 bits = 128 bits (16 bytes).

3. Total cache size = 256 lines × 16 bytes per line = 4 KB.

In addition to this data, each cache line has an address (cache-tag) and status bits (like valid and dirty).
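
The same arithmetic can also be written down as compile-time checks; this tiny sketch uses the hypothetical constants from the earlier sketches (C11 static_assert).

#include <assert.h>

#define LINE_COUNT      256   /* cache lines                */
#define BLOCKS_PER_LINE 4     /* 32-bit blocks in each line */
#define BYTES_PER_BLOCK 4     /* 32 bits = 4 bytes          */

#define BYTES_PER_LINE  (BLOCKS_PER_LINE * BYTES_PER_BLOCK)  /* 128 bits = 16 bytes */
#define CACHE_BYTES     (LINE_COUNT * BYTES_PER_LINE)        /* total data storage  */

/* Compile-time checks of the worked example above. */
static_assert(BYTES_PER_LINE == 16,   "one line holds 4 x 4 bytes = 16 bytes");
static_assert(CACHE_BYTES    == 4096, "256 lines x 16 bytes = 4096 bytes = 4 KB");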

Summary

Cache memory uses:

•	A map (directory store) to track data locations.

•	A data section to store actual data chunks.

•	Status bits to manage the freshness and changes of data.

This setup ensures the processor can access data faster by reducing trips to slower main memory.
