Presentation on
Cache memory

Group members:

1. Ishrakh Bari Shuvo – 181002027
2. Fatema Jannat Nisha – 181002025
3. Sowpna Akter Mim – 181002067
4. Sabikun Nahar Tanha – 181002068
5. Abu Rayhan Prince – 181002198

Content:

1. Introduction to cache memory
2. Location of cache memory
3. Types of cache memory
4. Locality
5. Cache mapping
6. Example of cache memory
7. Applications of cache memory
8. Cache performance
9. Pros & cons of cache memory
Cache memory

Cache memory is a small-sized type of volatile computer memory that provides high-speed
data access to the processor and stores frequently used computer programs, applications and data.
Cache memory acts as a buffer between RAM and the CPU, which provides faster service.

Fig: Cache Memory Position


Location of Cache memory

1. Cache memory lies between CPU and main memory.
2. Data is transferred in the form of words between the
cache memory and the CPU.
3. Data is transferred in the form of blocks or pages
between the cache memory and the main memory.

Figure: Location of Cache memory


Types of cache memory:

There are three types of cache memory:
1) direct-mapped cache
2) fully associative cache
3) K-way-set-associative cache
Types of cache memory:

1. Direct-Mapped Cache:

In a direct-mapped cache, each memory location maps to exactly one cache line, so at any time a given
address can occupy only that single line. This organization generally gives lower performance than the others.

2. Fully Associative Cache:


In a fully associative cache, any memory location can be cached in any cache line. This organization
significantly reduces the number of cache-line misses, but it is considered the most complex type of
cache memory to implement.

3. K-Way-Set-Associative Cache:
In a K-way set-associative cache, the most common cache implementation, each memory address can be
stored in any of the K lines of one particular set.
Locality:

Cache memory is based on the principle of locality of reference. There are two ways in which data or
instructions are fetched from main memory and stored in cache memory. These two ways are the following:

1. Temporal locality
2. Spatial locality

1. Temporal locality:
Temporal locality refers to the reuse of the same data and/or resources within a relatively short period
of time.
2. Spatial locality:
Spatial locality refers to the use of data elements stored in relatively close (nearby) memory locations.
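
As a small illustration (a hypothetical Python sketch, not part of the original slides), the loop below
exhibits both kinds of locality:

```python
# Hypothetical sketch: summing an array exhibits both kinds of locality.
def sum_array(data):
    total = 0              # 'total' is reused on every iteration -> temporal locality
    for value in data:     # elements are read in adjacent memory order -> spatial locality
        total += value
    return total

print(sum_array(list(range(1000))))   # prints 499500
```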
Cache mapping

 Cache mapping –
1. Direct mapping
2. Fully Associative Mapping
3. Set Associative Mapping / K-way Set Associative Mapping
Direct mapping

 Simplest mapping technique - each block of main memory maps to only one
cache line
 If a block is in the cache, it must be in one specific place
 Formula to map a memory block to a cache line:
 i = j mod c
 i = Cache Line Number
 j = Main Memory Block Number
 c = Number of Lines in Cache
 We divide the memory block number by the number of lines in the cache; the remainder
is the cache line number (see the figure and sketch below)
Direct Mapping with C=4

Figure: Main memory blocks 0–7 (words w0–w31, four words per block) map to cache slots 0–3.
For example, 0 mod 4 = remainder 0 and 4 mod 4 = remainder 0, so blocks 0 and 4 both map to slot 0.
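
To make the formula concrete, here is a minimal Python sketch (not part of the original slides; the
cache size and the helper name cache_line are illustrative, chosen to match the C = 4 example above):

```python
# Minimal sketch of direct mapping: i = j mod c
CACHE_LINES = 4   # c: number of lines in the cache (matches the C = 4 example)

def cache_line(block_number):
    """Return the single cache line that main memory block j maps to."""
    return block_number % CACHE_LINES

# Blocks 0 and 4 both map to line 0, so they would evict each other.
for j in [0, 1, 4, 5, 7]:
    print(f"block {j} -> cache line {cache_line(j)}")
```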
Direct Mapping pros & cons

 Simple
 Inexpensive
 Fixed location for given block
 If a program repeatedly accesses 2 blocks that map to the same line, cache misses
are very high – a condition called thrashing
Fully Associative Mapping

 A fully associative mapping scheme can overcome the problems of the direct mapping scheme.
 A block can map to any slot.
 A tag is used to identify which block is in which slot.
 All slots are searched in parallel for the target.

Figure: Main memory blocks 0–7 can be placed in any of cache slots 0–3.
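
As a rough illustration (a hypothetical Python sketch, not from the slides; the class name and the LRU
replacement policy are assumptions, since the slides do not specify a policy), a fully associative cache
can be modeled as a small collection of tagged entries where every slot is a candidate on each lookup:

```python
from collections import OrderedDict

class FullyAssociativeCache:
    """Toy fully associative cache: any block can occupy any slot.
    Uses LRU replacement (an assumption; the slides do not name a policy)."""

    def __init__(self, num_slots=4):
        self.num_slots = num_slots
        self.slots = OrderedDict()   # tag (block number) -> data

    def access(self, block_number, data=None):
        if block_number in self.slots:            # every slot is checked for the tag
            self.slots.move_to_end(block_number)  # mark as most recently used
            return "hit"
        if len(self.slots) >= self.num_slots:     # cache full: evict least recently used
            self.slots.popitem(last=False)
        self.slots[block_number] = data
        return "miss"

cache = FullyAssociativeCache(num_slots=4)
for block in [0, 4, 0, 8, 12, 0]:
    print(block, cache.access(block))  # block 0 stays cached, so it keeps hitting
```

Because block 0 can sit in any slot, accessing it alongside blocks 4, 8 and 12 no longer causes the
conflict misses that a direct-mapped cache would suffer.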
K-way Set Associative Mapping

 Two-way set associative gives much better performance than direct mapping
 Just one extra slot avoids the thrashing problem
 Four-way set associative gives only slightly better performance than two-way
 Further increases in the size of the set have little effect other than increased cost of
the hardware!
 To compute cache set number:
 Set Num = j mod v
j = main memory block number
v = number of sets in cache

Figure: A two-way set associative cache with two sets (Set 0: slots 0–1, Set 1: slots 2–3);
each main memory block 0–5 maps to set (j mod 2) and may occupy either slot in that set.
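
A minimal Python sketch of the set-selection step (not from the slides; the values v = 2 and K = 2 and
the helper name cache_set simply follow the two-set figure above):

```python
# Minimal sketch of K-way set associative mapping: set number = j mod v
NUM_SETS = 2   # v: number of sets in the cache (matches the figure above)
K = 2          # number of slots (ways) per set

def cache_set(block_number):
    """The block maps to one set, but may occupy any of the K slots in that set."""
    return block_number % NUM_SETS

for j in range(6):
    print(f"block {j} -> set {cache_set(j)} (any of {K} slots)")
```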


Example of Cache

Imagine you're doing research at a library. You need to refer to books, but there's really only a
handful of books that are useful for the research you're doing right now. What's the solution? You keep
that handful of books on your desk instead of going back to the shelves every time.

For web services, the vast amount of data is the World Wide Web, and the small amount you keep nearby
is stored on your computer as a cache.
Application of Cache Memory

Usually, the cache memory can store a reasonable number of blocks at any given
time, but this number is small compared to the total number of blocks in the main
memory.

The correspondence between the main memory blocks and those in the cache is
specified by a mapping function.
Cache Performance

 When the processor needs to read or write a location in main memory, it first
checks for a corresponding entry in the cache.

 The performance of cache memory is frequently measured in terms of a quantity
called the hit ratio.

Hit ratio = cache hits / (cache hits + cache misses)
          = no. of hits / total accesses
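
A quick worked example (the access counts below are made up purely for illustration):

```python
# Illustrative hit-ratio calculation; the counts are invented for the example.
cache_hits = 950
cache_misses = 50

hit_ratio = cache_hits / (cache_hits + cache_misses)
print(f"Hit ratio = {hit_ratio:.2f}")   # 950 / 1000 = 0.95
```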
Improve Cache Performance

 We can improve cache performance by using a larger cache block size.

 We can improve cache performance by using higher associativity.

 Reducing the miss rate.

 Reducing the miss penalty.

 Reduce the time to hit in cache.
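
The last three bullets correspond to the three terms of the average memory access time,
AMAT = hit time + miss rate × miss penalty. The slides do not give this formula, so the sketch below
uses made-up cycle counts purely to show how each term affects performance:

```python
# Illustrative AMAT calculation (cycle counts and miss rates are made-up values).
def amat(hit_time, miss_rate, miss_penalty):
    """Average memory access time = hit time + miss rate * miss penalty."""
    return hit_time + miss_rate * miss_penalty

print(amat(hit_time=1, miss_rate=0.05, miss_penalty=100))  # 6.0 cycles
print(amat(hit_time=1, miss_rate=0.02, miss_penalty=100))  # 3.0 cycles with a lower miss rate
```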


Pros. & Cons. of Cache Memory

Pros. of Cache:
 Cache memory is faster than main memory.
 It consumes less access time as compared to main memory.
 It stores the program that can be executed within a short period of time.
 It stores data for temporary use.

Cons. of Cache:
 Cache memory has limited capacity.
 Cache works only for cache-friendly data, not for data structures that hang together via pointers, such as a list.
 It is more expensive.
Thank You
