ACA-Lecture 14

The document discusses the concept of memory hierarchy in modern computer systems, which organizes memory types based on speed, cost, and size. It details various levels of memory, including registers, cache, RAM, and storage types, along with their characteristics and access patterns. Cache memory is emphasized for its role in improving CPU performance by storing frequently accessed data, with different mapping techniques and the implications of cache hits and misses explained.


Lecture 7: Memory Hierarchy and Cache Memory

1. Introduction

Modern computer systems use different types of memory, organized in a hierarchy to balance speed, size, and cost.

2. What is Memory Hierarchy?

Memory Hierarchy is a structure that organizes memory types in levels based on:

• Speed (fastest at the top)
• Cost per bit (highest at the top)
• Size (smallest at the top)

3. Levels of Memory Hierarchy

Level  Type                 Speed      Cost       Size
1      Registers            Fastest    Very High  Very Small
2      Cache (L1, L2, L3)   Fast       High       Small
3      Main Memory (RAM)    Medium     Moderate   Medium
4      Secondary Storage    Slow       Low        Large
5      Tertiary Storage     Very Slow  Very Low   Very Large

4. Why Use a Hierarchy?

• Fast memory is expensive and small
• Large memory is cheap but slow
• The hierarchy gives a balance between performance and cost

5. CPU Access Pattern

• The CPU first checks its registers
• Then the cache
• Then RAM
• Finally, secondary storage (disk) if the data is not found

This ordering reduces the average memory access time.
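The effect of the hierarchy on access time can be quantified with the standard average memory access time (AMAT) formula: hit time plus miss rate times miss penalty. A minimal Python sketch follows; the latencies and hit rate are illustrative assumptions, not measurements of any real machine:

```python
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    """Average memory access time for one cache level:
    AMAT = hit_time + miss_rate * miss_penalty."""
    return hit_time_ns + miss_rate * miss_penalty_ns

# Assumed numbers: 1 ns cache hit, 95% hit rate, 100 ns RAM access.
print(amat(1.0, 0.05, 100.0))  # 1 + 0.05 * 100 = 6.0 ns
```

Even with a large miss penalty, a high hit rate keeps the average close to the fast cache's access time, which is exactly why the hierarchy works.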

6. What is Cache Memory?

Cache is a small, high-speed memory placed between the CPU and RAM.
It stores frequently accessed data and instructions.

7. Types of Cache

• L1 Cache:
o Closest to CPU
o Smallest and fastest
• L2 Cache:
o Larger and slower than L1
o Usually private to each core on modern CPUs, though some designs share it
• L3 Cache:
o Even larger
o Shared among all CPU cores

8. Cache Mapping Techniques

a) Direct Mapping

• Each memory block maps to one specific cache line
• Simple but can cause collisions (conflict misses)
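Direct mapping is just modular arithmetic: the block number mod the number of cache lines selects the line. A minimal Python illustration (the line count and block numbers are made up for the example):

```python
NUM_LINES = 8  # assumed cache size in lines

def cache_line(block_number, num_lines=NUM_LINES):
    """Direct mapping: each block maps to exactly one line."""
    return block_number % num_lines

# Blocks 3 and 11 collide: both map to line 3, so loading
# one evicts the other even if the rest of the cache is empty.
print(cache_line(3))   # 3
print(cache_line(11))  # 3
```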

b) Associative Mapping

• Any memory block can go anywhere in the cache
• More flexible, but every line must be searched on each access, making lookups slower and the hardware more expensive

c) Set-Associative Mapping

• Combines both techniques
• Divides the cache into sets; a block maps to one set but can go into any line within that set
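Set-associative mapping can be sketched the same way: the block number selects a set, and the block may occupy any way within that set. A minimal Python sketch of a 2-way cache (the set count and block numbers are assumed for illustration):

```python
NUM_SETS = 4  # assumed number of sets
WAYS = 2      # 2-way set associative (assumed)

def cache_set(block_number, num_sets=NUM_SETS):
    """Set-associative mapping: a block maps to one set,
    but may occupy any of the WAYS lines inside that set."""
    return block_number % num_sets

# Blocks 1 and 5 share set 1, but with 2 ways both can
# reside in the cache at once -- unlike direct mapping.
print(cache_set(1))  # 1
print(cache_set(5))  # 1
```

This is the compromise the notes describe: fewer conflicts than direct mapping, with a much cheaper lookup than a fully associative search.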

9. Cache Hit and Miss


• Cache Hit:
Data is found in the cache
→ Fast access
• Cache Miss:
Data is not found in the cache
→ Slower access: the data must be fetched from RAM or disk
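Hits and misses can be demonstrated with a tiny cache simulation. The sketch below assumes a fully associative cache with LRU replacement; the capacity and access sequence are made up for illustration:

```python
from collections import OrderedDict

def simulate(accesses, capacity):
    """Count hits and misses for a fully associative LRU cache."""
    cache = OrderedDict()
    hits = misses = 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)        # mark as most recently used
        else:
            misses += 1
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[addr] = True
    return hits, misses

print(simulate([1, 2, 1, 3, 1, 2], capacity=2))  # (2, 4)
```

Repeated accesses to block 1 hit because it stays resident, while blocks 2 and 3 keep evicting each other: the same locality argument that motivates caches in the first place.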

10. Real-Life Analogy

Think of cache as your shirt pocket:

• You keep items you need frequently (pen, phone)
• You don’t go to your backpack every time

11. Memory Hierarchy Diagram

Characteristics of Memory Hierarchy:

• Capacity: the total volume of information the memory can store. Capacity increases as we move from top to bottom of the hierarchy.
• Access Time: the interval between a read/write request and the availability of the data. Access time increases as we move from top to bottom of the hierarchy.
• Performance: the memory hierarchy design ensures that frequently accessed data is stored in faster memory, improving overall system performance.
• Cost Per Bit: cost per bit increases as we move from bottom to top, i.e., internal memory is costlier than external memory.

12. Advantages of Using Cache

• Faster data access
• Reduces load on RAM
• Improves CPU performance

13. Class Activity

Ask students:

• What happens if the CPU doesn't find data in the cache?
• Can we put all data in cache? Why or why not?
