Memory Organization: by Saniya Mhatre
Memory organization is the fundamental
structure that enables computers to efficiently
store and retrieve data. Understanding this
hierarchy is crucial for optimizing system
performance and making the most of available
resources.
Memory Interleaving
• An arithmetic pipeline usually requires two or more
operands to enter the pipeline at the same time.
• Instead of using two memory buses for simultaneous
access, the memory can be partitioned into a number
of modules connected to common memory address
and data buses.
• A memory module is a memory array together with its
own address and data registers.
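The idea above can be sketched in a short simulation. This is a minimal illustration of low-order interleaving: the module count and the address split are assumptions for the example, not values from the slides.

```python
NUM_MODULES = 4  # assumed number of memory modules for illustration

def interleave(address):
    """Split an address for a low-order-interleaved memory.

    The low-order bits select the module, so consecutive addresses
    land in different modules and can be fetched in parallel.
    """
    module = address % NUM_MODULES   # which module holds the word
    offset = address // NUM_MODULES  # position within that module
    return module, offset

# Consecutive addresses 0..7 spread evenly across all four modules:
for addr in range(8):
    print(addr, "->", interleave(addr))
```

Because addresses 0, 1, 2, 3 map to modules 0, 1, 2, 3, a pipeline needing two adjacent operands can read them from two different modules in the same cycle.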
Hierarchical Memory Organization
• Registers: Fastest, but smallest. Used by the processor for immediate data access.
• Cache: Intermediate storage that bridges the gap between the processor and main memory.
• Main Memory: Larger but slower storage that holds the bulk of the program and data.
Cache Memory
• The cache is a small and very fast memory, interposed between
the processor and the main memory.
• Its purpose is to make the main memory appear to the processor
to be much faster than it actually is.
• The cache memory can store a reasonable number of blocks at
any given time, but this number is small compared to the total
number of blocks in the main memory.
• The correspondence between the main memory blocks and those
in the cache is specified by a mapping function.
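One common mapping function is direct mapping, where each main-memory block can go in exactly one cache block. The sketch below assumes a particular cache geometry (128 blocks of 16 words) purely for illustration.

```python
CACHE_BLOCKS = 128   # assumed number of blocks in the cache
BLOCK_WORDS = 16     # assumed words per block

def map_address(word_address):
    """Split a main-memory word address into (tag, index, offset).

    index  - the one cache block this memory block maps to
    tag    - identifies which of the many memory blocks is resident
    offset - the word's position within the block
    """
    offset = word_address % BLOCK_WORDS
    block = word_address // BLOCK_WORDS   # main-memory block number
    index = block % CACHE_BLOCKS          # cache block it must use
    tag = block // CACHE_BLOCKS
    return tag, index, offset
```

Any two memory blocks whose block numbers differ by a multiple of 128 compete for the same cache block; the stored tag tells them apart.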
Cache hits
• The processor does not need to know explicitly about
the existence of the cache.
• It simply issues Read and Write requests using
addresses that refer to locations in the memory. The
cache control circuitry determines whether the requested
word currently exists in the cache.
• If it does, the Read or Write operation is performed
on the appropriate cache location; this is called a
cache hit.
Set Associative
A compromise between direct-mapped and fully associative mapping: each block maps to one set, but may occupy any way within that set, offering balanced performance.
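A set-associative lookup can be sketched as follows. The geometry (4 sets, 2 ways) and the random replacement on a miss are assumptions for the example; the key point is that the address fixes the set, but all ways within the set are searched.

```python
import random

NUM_SETS = 4   # assumed number of sets for illustration
WAYS = 2       # 2-way set associative

sets = [[None] * WAYS for _ in range(NUM_SETS)]  # tags stored per set

def lookup(block_number):
    """Return True on a hit; on a miss, place the block in its set."""
    index = block_number % NUM_SETS    # the set is fixed by the address...
    tag = block_number // NUM_SETS
    if tag in sets[index]:             # ...but any way in the set may match
        return True
    victim = random.randrange(WAYS)    # random replacement on a miss
    sets[index][victim] = tag
    return False
```

The first access to a block misses and installs it; a repeated access to the same block then hits, because its tag is found in one of the set's ways.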
Replacement Algorithms
When the cache is full, the replacement policy determines which data to evict to make
room for new data. Common policies include Least Recently Used (LRU), First-In
First-Out (FIFO), and random replacement. The goal is to maximize cache hit rates and
overall system performance.