(Computer Awareness) Addressing modes (091)
MEMORY ORGANIZATION
Addressing Modes
• The term addressing modes refers to the way in which the operand of an instruction is specified. The
addressing mode specifies a rule for interpreting or modifying the address field of the instruction before
the operand is actually referenced.
• Addressing modes for 8086 instructions are divided into two categories:
• Addressing modes for data
• Addressing modes for branch
• The 8086 memory addressing modes provide flexible access to memory, allowing you to easily access
variables, arrays, records, pointers, and other complex data types.
• The key to good assembly language programming is the proper use of memory addressing modes.
• An assembly language program instruction consists of two parts: Opcode and Operand.
IMPORTANT TERMS
• Segment address: Starting address of the memory segment.
• Effective address or Offset: An offset is determined by adding any combination of three address
elements: displacement, base and index (a minimal worked sketch follows this list).
• Displacement: It is an 8-bit or 16-bit immediate value given in the instruction.
• Base: Contents of base register, BX or BP.
• Index: Content of index register SI or DI.
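As a rough illustration of how these three elements combine into an offset, here is a minimal sketch in C (not 8086 assembly); the register contents and the displacement below are invented values used only to show the arithmetic.

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* Illustrative values only: BX as base, SI as index, plus a displacement. */
        uint16_t bx   = 0x1000;   /* base register contents    */
        uint16_t si   = 0x0020;   /* index register contents   */
        uint16_t disp = 0x0004;   /* 8- or 16-bit displacement */

        /* Effective address (offset) = base + index + displacement,
           taken modulo 2^16 because 8086 offsets are 16 bits wide. */
        uint16_t offset = (uint16_t)(bx + si + disp);

        printf("Effective address (offset) = %04Xh\n", (unsigned)offset);  /* prints 1024h */
        return 0;
    }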
According to the different ways of specifying an operand, the 8086 microprocessor uses different
addressing modes.
• The different ways of specifying the location of an operand in an instruction are called addressing
modes.
• Example (indirect addressing): AC ← AC + [[X]], i.e. the address field X names a memory location whose
content is the address of the operand.
Register Direct Addressing Mode –
• In this addressing mode, the operand is held in one of the CPU's registers.
• The address field of the instruction refers to a CPU register that contains the operand.
• No reference to memory is required to fetch the operand.
AC ← AC + [R]
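A hedged sketch, again in C rather than assembly, contrasting the register form AC ← AC + [R] with the indirect form AC ← AC + [[X]] noted earlier; the tiny memory array, the register value and the address field below are all invented for illustration.

    #include <stdint.h>
    #include <stdio.h>

    static uint16_t mem[16];        /* tiny illustrative memory            */
    static uint16_t AC = 5;         /* accumulator                         */
    static uint16_t R  = 7;         /* register holding the operand itself */

    int main(void) {
        mem[3] = 9;                 /* address 3 holds the operand's address */
        mem[9] = 40;                /* the operand itself lives at address 9 */

        /* Register addressing: the operand is the register content, so no
           memory reference is needed.  AC <- AC + [R]  =>  5 + 7 = 12       */
        uint16_t ac_register = (uint16_t)(AC + R);

        /* Indirect addressing: the address field X names a location whose
           content is the operand's address.
           AC <- AC + [[X]] with X = 3  =>  5 + mem[mem[3]] = 5 + 40 = 45    */
        uint16_t X = 3;
        uint16_t ac_indirect = (uint16_t)(AC + mem[mem[X]]);

        printf("register addressing: AC = %u\n", (unsigned)ac_register);
        printf("indirect addressing: AC = %u\n", (unsigned)ac_indirect);
        return 0;
    }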
• Volatile Memory: This loses its data, when power is switched off.
• Non-Volatile Memory: This is a permanent storage and does not lose any data when power is
switched off.
Memory Hierarchy
• The total memory capacity of a computer can be visualized as a hierarchy of components. The memory
hierarchy system consists of all storage devices contained in a computer system, from the slow
auxiliary memory to the faster main memory and the smaller, still faster cache memory.
• Auxiliary memory access time is generally 1000 times that of the main memory, hence it is at the
bottom of the hierarchy.
• The main memory occupies the central position because it is equipped to communicate directly with
the CPU and with auxiliary memory devices through Input/output processor (I/O).
• When programs not residing in main memory are needed by the CPU, they are brought in from
auxiliary memory. Programs not currently needed in main memory are transferred to auxiliary
memory to provide space in main memory for the programs that are currently in use.
• The cache memory is used to store the program instructions and data currently being used by the CPU.
The approximate access time ratio between cache memory and main memory is about 1 : 7 to 1 : 10.
Memory Access Methods
Each memory type is a collection of numerous memory locations. To access data from any memory, the
required location must first be found, and then the data is read from that location. The following are the
methods used to access information from memory locations:
• Random Access: Main memories are random access memories, in which each memory location has a
unique address. Using this unique address any memory location can be reached in the same amount of
time in any order.
• Sequential Access: This method allows memory access in a sequence, i.e. in order.
• Direct Access: In this mode, information is stored in tracks, with each track having a separate
read/write head.
Main Memory
• The memory unit that communicates directly with the CPU, auxiliary memory and cache memory is called
the main memory.
• It is the central storage unit of the computer system.
• It is a large and fast memory used to store data during computer operations.
• Main memory is made up of RAM and ROM, with RAM integrated circuit chips holding the major share.
Cache Memory –
• The data or contents of the main memory that are used again and again by the CPU are stored in the cache memory so
that they can be accessed in a shorter time.
• Whenever the CPU needs to access memory, it first checks the cache memory. If the data is not found in the cache
memory, the CPU moves on to the main memory. It also transfers a block of the recently used data into the cache and keeps
deleting old data from the cache to accommodate the new one.
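A rough sketch of that check-cache-first behaviour, written in C with a tiny direct-mapped cache; the cache size, the modulo placement rule, the backing memory contents and the access pattern are all assumptions chosen only to keep the example short.

    #include <stdio.h>
    #include <stdbool.h>

    #define CACHE_LINES 4
    #define MEM_WORDS   64

    /* One cache line: which memory address it currently mirrors, and its data. */
    struct line { bool valid; int addr; int data; };

    static int main_memory[MEM_WORDS];
    static struct line cache[CACHE_LINES];
    static int hits = 0, misses = 0;

    /* Read one word: try the cache first, fall back to main memory on a miss,
       and overwrite (evict) whatever older data occupied that cache line.     */
    static int read_word(int addr) {
        struct line *l = &cache[addr % CACHE_LINES];   /* direct-mapped placement */
        if (l->valid && l->addr == addr) {
            hits++;                                    /* hit: served from cache  */
            return l->data;
        }
        misses++;                                      /* miss: go to main memory */
        l->valid = true;
        l->addr  = addr;
        l->data  = main_memory[addr];                  /* old content is replaced */
        return l->data;
    }

    int main(void) {
        for (int i = 0; i < MEM_WORDS; i++) main_memory[i] = i * 10;

        /* Re-reading recently used addresses shows later accesses hitting the cache. */
        int pattern[] = {1, 2, 1, 2, 5, 1};
        for (int i = 0; i < 6; i++) read_word(pattern[i]);

        printf("hits = %d, misses = %d\n", hits, misses);   /* hits = 2, misses = 4 */
        return 0;
    }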
Hit Ratio -
• The performance of cache memory is measured in terms of a quantity called the hit ratio. When the CPU refers to
memory and finds the word in the cache, it is said to produce a hit. If the word is not found in the cache, it must be
fetched from main memory, and this counts as a miss.
• The ratio of the number of hits to the total CPU references to memory is called hit ratio.
• Hit Ratio = Hit/(Hit + Miss)
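• For example, if out of 1000 CPU references to memory 950 words are found in the cache and 50 are not, then
Hit Ratio = 950/(950 + 50) = 0.95.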
Associative Memory –
• It is also known as content addressable memory (CAM).
• It is a memory chip in which each bit position can be compared.
• In this memory the content is compared in each bit cell, which allows very fast table lookup. Since the entire chip can be
compared at once, contents can be stored in any location without regard to an addressing scheme.
• These chips have less storage capacity than regular memory chips.
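A small sketch of the content-addressed lookup idea, in C: every stored entry is compared against a search key and matches are reported by position, whereas real CAM hardware performs all of these comparisons simultaneously; the table entries and the search key below are invented.

    #include <stdio.h>
    #include <string.h>
    #include <stdbool.h>

    #define ENTRIES 4

    /* Each entry stores a content word; hardware compares all entries at once,
       which the loop below can only imitate one entry at a time.              */
    static const char *cam[ENTRIES] = { "red", "green", "blue", "green" };

    int main(void) {
        const char *key = "green";          /* search by content, not by address */
        bool found = false;

        for (int i = 0; i < ENTRIES; i++) {
            if (strcmp(cam[i], key) == 0) { /* the per-cell comparison           */
                printf("match at entry %d\n", i);
                found = true;
            }
        }
        if (!found) printf("no match\n");
        return 0;
    }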
Virtual Memory
• Virtual memory is the separation of logical memory from physical memory. This separation provides large virtual
memory for programmers when only small physical memory is available.
• Virtual memory is used to give programmers the illusion that they have a very large memory even though the
computer has a small main memory.
• It makes the task of programming easier because the programmer no longer needs to worry about the amount of
physical memory available.
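A minimal sketch in C of the separation of logical from physical memory: a logical address is split into a page number and an offset and mapped through a page table to a physical frame. The page size, the number of pages and the page-table contents are assumptions chosen only for illustration; a real system would also handle pages that are not currently in physical memory.

    #include <stdint.h>
    #include <stdio.h>

    #define PAGE_SIZE 256u     /* assumed page size: 256 bytes    */
    #define NUM_PAGES 8u       /* assumed number of logical pages */

    /* page_table[p] gives the physical frame that holds logical page p. */
    static const uint32_t page_table[NUM_PAGES] = {5, 2, 7, 0, 3, 6, 1, 4};

    /* Translate a logical (virtual) address into a physical address. */
    static uint32_t translate(uint32_t logical) {
        uint32_t page   = (logical / PAGE_SIZE) % NUM_PAGES;
        uint32_t offset = logical % PAGE_SIZE;
        return page_table[page] * PAGE_SIZE + offset;
    }

    int main(void) {
        uint32_t logical = 2u * PAGE_SIZE + 17u;   /* page 2, offset 17 */
        printf("logical %u -> physical %u\n",
               (unsigned)logical, (unsigned)translate(logical));
        /* page 2 maps to frame 7, so this prints: logical 529 -> physical 1809 */
        return 0;
    }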
THANK YOU!