(2) Storage Hierarchy

  • caching


-->> In computer science, a cache (pronounced /kæʃ/) is a collection of data duplicating original values stored elsewhere or computed earlier, where the original data is expensive to fetch (owing to longer access time) or to compute compared to the cost of reading the cache. In other words, a cache is a temporary storage area where frequently accessed data can be stored for rapid access. Once the data is stored in the cache, future uses can read the cached copy rather than re-fetching or recomputing the original data.

Caches have proven extremely effective in many areas of computing because access patterns in typical computer applications exhibit locality of reference. There are several kinds of locality; the focus here is on data that are accessed close together in time (temporal locality). Such data may or may not be located physically close together (spatial locality).
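To make the idea concrete, here is a minimal sketch in Python of a lookaside cache in front of an expensive computation. The function names, the dict-based store, and the one-second delay are illustrative assumptions for this example, not part of any particular system.

    # Minimal caching sketch: results of an expensive computation are kept
    # in a dict so that repeated requests for the same key (temporal
    # locality) hit the cached copy instead of recomputing.
    import time

    cache = {}

    def expensive_fetch(key):
        """Stand-in for a slow fetch or computation (assumed cost: ~1 second)."""
        time.sleep(1)
        return key * key

    def cached_fetch(key):
        if key in cache:              # cache hit: fast path
            return cache[key]
        value = expensive_fetch(key)  # cache miss: pay the full cost once
        cache[key] = value
        return value

    if __name__ == "__main__":
        start = time.time()
        cached_fetch(7)               # miss: about 1 s
        cached_fetch(7)               # hit: microseconds
        print("total time:", round(time.time() - start, 2), "s")

The second call returns almost immediately because the access pattern repeats a recent key, which is exactly the temporal locality that makes caching pay off.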



  • coherency and consistency


In computing, cache coherence (also cache coherency) refers to the integrity of data stored in the local caches of a shared resource. Cache coherence is a special case of memory coherence.

When clients in a system maintain caches of a common memory resource, problems may arise with inconsistent data. This is particularly true of CPUs in a multiprocessing system. If one client holds a copy of a memory block from a previous read and another client then changes that block, the first client can be left with an invalid cached copy without any notification of the change. Cache coherence is intended to manage such conflicts and maintain consistency between the caches and memory.
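The scenario above can be sketched as a toy write-invalidate scheme: every write removes the block from the other caches, so the next read there misses and re-fetches the up-to-date value. This is a simplified Python illustration, not a real hardware protocol; the class and method names are invented for the example.

    # Toy write-invalidate sketch: two caches share one memory; a write in
    # one cache invalidates the stale copy in the other.
    class Memory:
        def __init__(self):
            self.blocks = {0: "A"}

    class Cache:
        def __init__(self, memory):
            self.memory = memory
            self.peers = []            # other caches that must stay coherent
            self.lines = {}            # block number -> locally cached value

        def read(self, block):
            if block not in self.lines:            # miss: fetch from memory
                self.lines[block] = self.memory.blocks[block]
            return self.lines[block]

        def write(self, block, value):
            self.memory.blocks[block] = value      # write through to memory
            self.lines[block] = value
            for peer in self.peers:                # invalidate stale copies
                peer.lines.pop(block, None)

    mem = Memory()
    top, bottom = Cache(mem), Cache(mem)
    top.peers.append(bottom)
    bottom.peers.append(top)

    top.read(0)            # top caches "A"
    bottom.write(0, "B")   # bottom updates the block and invalidates top's copy
    print(top.read(0))     # prints "B": top misses and re-fetches the new value

Without the invalidation step in write(), the final read would still return the stale "A", which is precisely the inconsistency that coherence protocols exist to prevent.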

cache consistency

-->> The synchronisation of data in multiple caches such that reading a memory location via any cache will return the most recent data written to that location via any (other) cache. Some parallel processors do not cache accesses to shared memory, to avoid the issue of cache coherency. If caches are used with shared memory, then some system is required to detect when data in one processor's cache should be discarded or replaced because another processor has updated that memory location. Several such schemes have been devised.


Cache Consistency Mechanisms

There are several ways to maintain cache consistency of files placed in a cache: time-to-live fields, active invalidation protocols, and client polling. Time-to-live fields are efficient to set up, but are inaccurate.
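A minimal sketch of the time-to-live approach follows: each cached entry carries an expiry timestamp, and entries older than the TTL are refetched. The 30-second TTL and fetch_from_server() are assumptions made up for this illustration; the comment notes the inherent inaccuracy, since an entry can remain stale for up to one full TTL period.

    # Time-to-live (TTL) cache sketch: entries expire after TTL_SECONDS and
    # are then refetched. Simple to set up, but a cached copy may be stale
    # for up to TTL_SECONDS after the original changes.
    import time

    TTL_SECONDS = 30
    cache = {}   # key -> (value, expiry_time)

    def fetch_from_server(key):
        """Stand-in for fetching the authoritative copy of a file or object."""
        return "contents of " + str(key)

    def get(key):
        entry = cache.get(key)
        if entry is not None and time.time() < entry[1]:
            return entry[0]                      # still fresh: serve cached copy
        value = fetch_from_server(key)           # expired or missing: refetch
        cache[key] = (value, time.time() + TTL_SECONDS)
        return value

Active invalidation protocols and client polling trade this simplicity for accuracy: the server pushes invalidations to clients, or clients ask the server whether their copy is still current before using it.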