Monday, 27 March 2017

Cache Mapping Technique


Cache is a small, fast, and expensive memory.
It is used to reduce the average memory access time.
Cache sits closer to the CPU than main memory.
Each cache entry stores data together with a tag derived from its main-memory address.
There are three ways of mapping main-memory blocks to cache locations:
1. Direct mapping
2. Fully associative mapping
3. Set-associative mapping

These methods determine how main-memory addresses are assigned to cache locations.

1. Direct mapping
Direct mapping is the simplest technique.
It is often used for instruction caches.
In direct mapping no search is needed: each block can go in exactly one cache line.
However, the cache is not always fully utilized, because blocks that map to the same line conflict even when other lines are free.
No replacement policy is required, since each block has only one possible location.
Only one tag comparison is required per access.
All blocks stored in the cache must have different indices.
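The lookup described above can be sketched in a few lines of Python. The block size, number of lines, and the `access` helper are hypothetical choices for illustration, not part of any real processor:

```python
# Minimal sketch of a direct-mapped cache lookup, assuming 16-byte
# blocks and 8 cache lines (hypothetical sizes for illustration).
BLOCK_SIZE = 16   # bytes per block
NUM_LINES = 8     # number of cache lines

def split_address(addr):
    """Split a byte address into (tag, index, offset)."""
    offset = addr % BLOCK_SIZE
    block = addr // BLOCK_SIZE
    index = block % NUM_LINES        # which single line the block must use
    tag = block // NUM_LINES         # identifies the block held in that line
    return tag, index, offset

# One tag entry per line; no search is needed -- the index picks the line.
tags = [None] * NUM_LINES

def access(addr):
    tag, index, _ = split_address(addr)
    if tags[index] == tag:
        return "hit"
    tags[index] = tag                # no replacement choice: only one candidate line
    return "miss"
```

Note how addresses 0 and 128 map to the same index here, so they evict each other even while the other seven lines stay empty, which is the under-utilization mentioned above.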

2. Fully associative mapping
In this type of mapping the tag memory is searched in parallel (associatively), hence the name associative mapping.
Cache utilization is the highest of the three techniques, since any block can be placed in any line.
The main-memory address is divided into two fields: the lower-order bits give the location of a word within a block, and the higher-order bits identify the block (the tag).
It is expensive to implement because of the parallel search.
As many comparators as there are tags are required, which means a large number of comparators.
The internal logic compares the incoming tag with all stored tags simultaneously.
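A software model can only approximate the parallel hardware search with a membership test. Here is a minimal sketch, assuming 16-byte blocks, 4 lines, and FIFO replacement (all hypothetical choices for illustration):

```python
# Minimal sketch of a fully associative cache, assuming 16-byte blocks
# and 4 lines with FIFO replacement (hypothetical choices).
from collections import deque

BLOCK_SIZE = 16
NUM_LINES = 4

# Any block may occupy any line, so we just keep a set of resident tags;
# a deque with maxlen gives us FIFO eviction for free.
lines = deque(maxlen=NUM_LINES)

def access(addr):
    tag = addr // BLOCK_SIZE      # the whole block number serves as the tag
    # Hardware compares the incoming tag against every stored tag in
    # parallel; in software this is modelled by a membership test.
    if tag in lines:
        return "hit"
    lines.append(tag)             # evicts the oldest tag when the cache is full
    return "miss"
```

Unlike the direct-mapped version, a replacement policy is needed here, because a new block may be placed in any line.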

3. Set-associative mapping
Set-associative mapping offers good performance but is more complex.
It is a combination of direct mapping and fully associative mapping.
The cache is divided into sets, and the search is performed in parallel over one cache set.
Set-associative mapping is widely used in microprocessors.
It allows a limited number of blocks with the same index but different tags.
The incoming tag is compared with all tags of the selected set using comparators.
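The hybrid scheme above can be sketched as follows. A 2-way cache with 4 sets, 16-byte blocks, and FIFO replacement within each set are hypothetical choices for illustration:

```python
# Minimal sketch of a 2-way set-associative cache, assuming 16-byte
# blocks and 4 sets (hypothetical sizes for illustration).
BLOCK_SIZE = 16
NUM_SETS = 4
WAYS = 2

sets = [[] for _ in range(NUM_SETS)]   # each set holds up to WAYS tags

def access(addr):
    block = addr // BLOCK_SIZE
    index = block % NUM_SETS           # index selects one set, as in direct mapping
    tag = block // NUM_SETS
    ways = sets[index]
    if tag in ways:                    # the selected set's tags are compared in parallel
        return "hit"
    if len(ways) == WAYS:
        ways.pop(0)                    # evict the oldest tag in the set (FIFO)
    ways.append(tag)
    return "miss"
```

With 2 ways, two conflicting blocks that would evict each other in a direct-mapped cache can coexist in the same set; only a third conflicting block forces an eviction.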
