What is Cache Memory? Types and Functions of Cache Memory

Cache Memory
Cache memory is a small, fast memory placed very close to the central processing unit (CPU). Recently issued instructions are stored in the cache, along with the data the CPU needs to carry out a task. However, the capacity of the cache memory is very small compared to main memory and the hard disk.
Importance of Cache Memory
The cache memory lies in the path between the processor and the main memory. It has a shorter access time than main memory and is therefore faster. For example, a cache memory may have an access time of 100 ns, while the main memory may have an access time of 700 ns. Cache memory is very expensive and hence limited in capacity. Earlier, cache memories were available as separate chips, but modern microprocessors contain the cache memory on the chip itself.
The need for cache memory arises from the mismatch between the speeds of the main memory and the CPU. The CPU clock, as discussed earlier, is very fast, whereas the main memory access time is comparatively slow. Hence, no matter how fast the processor is, the effective processing speed is limited by the speed of the main memory (the strength of a chain is the strength of its weakest link). For this reason, a cache memory with an access time closer to the processor speed is introduced.
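The benefit can be estimated with the standard effective-access-time formula, using the 100 ns and 700 ns figures from above; the 90% hit ratio here is an illustrative assumption, not a figure from the text:

```python
# Effective access time = hit_ratio * cache_time + (1 - hit_ratio) * memory_time
def effective_access_time(hit_ratio, cache_ns=100, memory_ns=700):
    return hit_ratio * cache_ns + (1 - hit_ratio) * memory_ns

# With an assumed 90% hit ratio, the average access time drops from
# 700 ns (no cache) to roughly 160 ns.
print(effective_access_time(0.9))
```

The higher the hit ratio, the closer the average access time gets to the cache's own speed.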
The cache memory stores the program (or the part of it) currently being executed, or one that may be executed within a short period of time. It also stores temporary data that the CPU may frequently require for manipulation.
The cache memory works according to various algorithms, which decide what information it should store. These algorithms estimate which data is most likely to be needed again, based on past access patterns.
It acts as a high-speed buffer between the CPU and main memory and is used to temporarily store very active data and instructions during processing. Since the cache memory is faster than main memory, processing speed is increased by making the data and instructions needed by the current computation available in the cache.
A computer can have different levels and sizes of cache depending on the CPU architecture. The most common levels are the L1 and L2 caches, where L1 is closest to the CPU and hence has a much faster access time than L2. The sizes of these caches can vary from 8 KB to 128 KB, or even 256 KB in modern systems.
Another mode of caching is Disk Caching
Disk caching works on the same principle as memory caching, but instead of using high-speed SRAM, it uses conventional main memory. The most recently accessed data from the hard drive (as well as adjacent sectors) is saved in a memory buffer. When a program needs to access data from the disk, it first checks the disk cache to see if the data is there. Disk caching can dramatically improve the performance of software, because accessing a byte of data in RAM can be thousands of times faster than accessing a byte on a hard disk.
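The check-the-cache-first logic described above can be sketched in a few lines of Python; the `disk` dictionary and sector contents are stand-ins for illustration, not a real disk API:

```python
# Minimal sketch of a disk cache: recently read sectors are kept in a
# dictionary (standing in for a RAM buffer) and served from there on a hit.
disk = {0: b"boot", 1: b"data1", 2: b"data2"}   # stand-in for sectors on disk
cache = {}                                       # RAM buffer: sector -> bytes

def read_sector(sector):
    if sector in cache:          # cache hit: served from fast RAM
        return cache[sector]
    data = disk[sector]          # cache miss: slow disk access
    cache[sector] = data         # keep a copy for future reads
    return data

read_sector(1)   # miss: fetched from the "disk"
read_sector(1)   # hit: served from the RAM buffer
```

A real disk cache would also bound the buffer's size and prefetch adjacent sectors, as the paragraph above notes.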
The effectiveness of any cache is determined by its hit rate: the higher the hit rate, the better the performance of the CPU and the better the caching algorithm implemented.
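The hit rate is simply the fraction of accesses served from the cache; the counts below are made-up numbers for illustration:

```python
# Hit rate = cache hits / total accesses (hits + misses).
hits, misses = 950, 50           # assumed counters, not measured values
hit_rate = hits / (hits + misses)
print(f"{hit_rate:.0%}")         # 95%
```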
The two most common caching algorithms are:
· Least Frequently Used (LFU): This cache algorithm uses a counter to keep track of how often an entry is accessed, and evicts the entry accessed least often.
· Least Recently Used (LRU): This cache algorithm keeps recently used items near the top of the cache, and evicts the item that has gone unused the longest.
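The LRU rule can be sketched with the Python standard library; this small class illustrates the eviction policy only and is not a production cache:

```python
from collections import OrderedDict

# LRU sketch: an OrderedDict remembers insertion order, so moving an entry
# to the end on every access keeps the least recently used entry at the front.
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None                    # cache miss
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")       # "a" becomes most recently used
cache.put("c", 3)    # capacity exceeded: "b" is evicted, not "a"
```

An LFU cache would instead keep a per-entry access counter and evict the entry with the smallest count.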
Recall: a small cache with a high hit ratio is far better than a large cache with a poor hit ratio.
A CPU cache is a cache used by the central
processing unit (CPU) of a computer to reduce the average time to access data
from the main memory. The cache is a smaller, faster memory which stores copies
of the data from frequently used main memory locations. Most CPUs have
different independent caches, including instruction and data caches, where the
data cache is usually organized as a hierarchy of more cache levels (L1, L2,
etc.).