Cache Memory with its Mapping

Cache Memory in Computer Organization
  • Cache memory is a type of fast, volatile computer memory that stores frequently used programs, applications, and data, and offers high-speed data access to the CPU.
  • The usage of cache memory reduces the average time it takes to access data from the main memory. 
  • The cache is a smaller, faster memory that stores copies of data from frequently accessed main memory locations. In a CPU, there are often several separate, independent caches for instructions and data.
  • Cache is a type of temporary memory storage that makes data retrieval simpler and more efficient. It is the computer's quickest memory, usually built directly into the CPU or placed between the CPU and primary random access memory (RAM).
  • Cache memory is the quickest memory available and functions as a buffer between RAM and the CPU. Every time the processor wants to read or write a location, it checks whether a corresponding entry is available in the cache, minimizing the time it takes to retrieve data from the main memory.
  • Hardware cache, a physical component of the CPU, is also known as processor cache. It can be classified as primary or secondary cache memory depending on how near it is to the CPU core. Primary cache memory is directly integrated into (or closest to) the processor.
  • The speed of a cache is determined both by its proximity to the CPU and by its size. Caches closer to the core are faster, and smaller caches can be accessed more quickly; larger caches hold more data and so raise the hit ratio, but each access takes longer.
Types of Cache:
  1. Primary Cache - A primary cache is always located on the CPU chip. This cache is small, and its access time is comparable to that of processor registers.
  2. Secondary Cache - The secondary cache sits between the primary cache and the rest of the memory. It is referred to as the level 2 (L2) cache. The L2 cache is frequently located on the CPU chip as well.
Levels of Cache memory
Register or Level 1.
  • It is a form of memory that stores and accepts data directly inside the CPU. The accumulator, program counter, and address register are some of the most commonly used registers.
Level 2 or Cache memory.
  • It is the fastest memory, with the quickest access time, where data is temporarily stored for quicker access.
Level 3 or Main memory.
  • It is the memory the computer is currently working from. It is limited in size, and its data is lost when the power is turned off.
Level 4 or Secondary memory.
  • It's external memory, which isn't as quick as main memory but stores data indefinitely.
Cache Performance:
  • When the processor wants to read or write a location in main memory, it first looks in the cache for a corresponding entry.
  • A cache hit occurs when the CPU discovers that the memory location is in the cache, and data is read from the cache.
  • A cache miss occurs when the CPU cannot locate the memory location in the cache. When a cache miss occurs, the cache creates a new entry and transfers data from main memory, after which the request is completed using the cache's contents.
  • Cache memory performance is commonly quantified in terms of a metric known as Hit ratio.
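The hit/miss flow described above can be sketched in Python, using one dict as a stand-in for the cache and another for main memory (the addresses and values here are made up purely for illustration):

```python
# Minimal sketch of the cache hit/miss flow. The addresses and values
# are illustrative, not taken from any real system.
main_memory = {0x10: "A", 0x20: "B", 0x30: "C"}
cache = {}

def read(address):
    if address in cache:          # cache hit: data is served from the cache
        return cache[address], "hit"
    value = main_memory[address]  # cache miss: fetch from main memory...
    cache[address] = value        # ...create a new cache entry...
    return value, "miss"          # ...then complete the request

print(read(0x10))  # ('A', 'miss')  first access misses
print(read(0x10))  # ('A', 'hit')   repeat access hits
```

Note that the second access to the same address is a hit, which is exactly why repeated accesses to frequently used data become faster.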
Hit Ratio:

  • The hit ratio is the fraction of memory accesses for which the required word is found in the cache memory.
  • A hit ratio is calculated by comparing the number of cache hits to the total number of content requests received.
  • A miss ratio is the inverse of this, in which the number of cache misses is computed and compared to the total number of content requests.
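The two ratios are simple arithmetic; a small Python sketch with made-up access counts:

```python
# Hypothetical access counts, chosen only to illustrate the arithmetic.
hits = 950
misses = 50
total_requests = hits + misses

hit_ratio = hits / total_requests     # fraction of requests served from the cache
miss_ratio = misses / total_requests  # the inverse: fraction that went to main memory

print(f"Hit ratio:  {hit_ratio:.2f}")   # 0.95
print(f"Miss ratio: {miss_ratio:.2f}")  # 0.05
```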
Cache memory Mapping
  • In cache memory, data is transferred as a block from main memory to cache memory. This process is known as cache mapping.
  • There are three types of cache mapping:
  1. Direct mapping.
  2. Associative mapping.
  3. Set-associative mapping.
1. Direct Mapping.
  • Instead of keeping the entire address information with the data, a direct mapping cache saves only a portion of the address bits along with the data.
  • According to the mapping rule for direct mapping, new data must be stored exclusively in a certain cache location. As a result, no replacement algorithm is required.
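The placement rule can be sketched in Python. The parameters here (16 cache lines, block offset ignored) are illustrative, not tied to any real CPU:

```python
# Sketch of how a direct-mapped cache splits an address. 16 lines means
# 4 index bits; the block offset is ignored to keep the sketch small.
NUM_LINES = 16

def direct_map(address):
    index = address % NUM_LINES  # the one line this block may occupy
    tag = address // NUM_LINES   # stored alongside the data for matching
    return index, tag

# Two addresses 16 apart map to the same line, so there is no choice of
# location and no replacement algorithm is needed: the new block simply
# overwrites whatever is in that line.
print(direct_map(5))   # (5, 0)
print(direct_map(21))  # (5, 1)
```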
Direct mapping's benefits
  • The simplest sort of cache memory mapping is direct mapping.
  • Only the tag field must match when searching for a word, which is why it is the fastest mapping to search.
  • When compared to associative cache mapping, direct mapping cache is less costly.
Direct mapping's drawbacks.
  • The performance of a direct mapping cache suffers when two frequently used blocks map to the same cache line, since each access then replaces the existing data and tag values (a conflict miss).
2. Associative Mapping.
  • Both the address and data of the memory word are kept in associative mapping.
  • Cache memory uses an associative mapping mechanism that is both versatile and quick.
  • Fully associative cache is another name for this mapping mechanism.
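A fully associative lookup can be sketched as a linear scan over (tag, data) pairs. Real hardware compares all tags in parallel; the tags and data below are made up for illustration:

```python
# Sketch of a fully associative lookup: the whole address is stored as
# the tag, and a block may sit in ANY line, so every tag is compared.
cache_lines = [(0x1A, "data1"), (0x2B, "data2"), (0x3C, "data3")]

def lookup(address):
    for tag, data in cache_lines:  # hardware does these comparisons in parallel
        if tag == address:
            return data            # hit
    return None                    # miss

print(lookup(0x2B))  # data2
print(lookup(0x99))  # None
```

Storing the full address as the tag and comparing every line is exactly what makes this scheme flexible but expensive in hardware.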
Associative mapping has a number of advantages.
  • Associative mapping is a quick method.
  • Associative mapping is simple to use.
Associative mapping's drawbacks.
  • Implementing associative mapping in cache memory is costly since it necessitates storing the address along with the data.
3. Set-Associative Mapping.
  • Two or more words can be stored under the same index location in Set-Associative cache memory.
  • Every data word is saved here along with its tag. A set is defined as the number of tag-data words under one index.
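A 2-way set-associative placement can be sketched as follows; the set count, way count, and the FIFO eviction policy are illustrative assumptions, not a specific CPU's design:

```python
# Sketch of 2-way set-associative placement: the index selects a set,
# and within that set either of two ways may hold the block.
NUM_SETS = 8
WAYS = 2
cache = [[] for _ in range(NUM_SETS)]  # each set holds up to WAYS (tag, data) pairs

def insert(address, data):
    index = address % NUM_SETS
    tag = address // NUM_SETS
    cache[index].append((tag, data))
    if len(cache[index]) > WAYS:  # set full: a replacement policy
        cache[index].pop(0)       # (simple FIFO here) evicts one way

insert(3, "X")
insert(11, "Y")        # 3 and 11 both map to set 3, yet coexist (2 ways)
print(cache[3])        # [(0, 'X'), (1, 'Y')]
```

Because several words can share an index, a replacement policy is needed again, but only within the small set rather than across the whole cache.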
Set-Associative Mapping's Benefits
  • When compared to the two cache memory types covered previously, Set-Associative cache memory has the highest hit-ratio. As a result, its performance is significantly improved.
Set-Associative Mapping's Drawbacks
  • Set-associative cache memory is quite costly, and as the size of each set grows, so does the cost.