Cache memory

From Computer Science Wiki
[[File:binary.png|frame|right|This is a basic concept in computer science]]


Cache is a very fast, small memory that usually sits on the CPU itself.


Cache memory has three key characteristics:
# Cache memory can be accessed faster than RAM;
# it is used to hold common, expected, or frequently used data and operations;
# it is closer to the CPU than RAM (situated between RAM and the CPU, often on the same die), with faster read/write speeds.


Cache memory is used to reduce the average memory access time. It does this by keeping copies of frequently accessed main-memory data close to the CPU, allowing the CPU to reach that data faster, because cache memory can be read much more quickly than main memory. There are several levels of cache (e.g. L1, L2 and L3).<ref>https://compsci2014.wikispaces.com/2.1.3+Explain+the+use+of+cache+memory (link now defunct)</ref>
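The idea of holding frequently used results in a small, fast store can be sketched with Python's standard <code>functools.lru_cache</code>. This is an illustrative stand-in, not hardware: the decorated function plays the role of a slow main-memory read, and the decorator plays the role of the cache.

```python
# Illustrative sketch only: functools.lru_cache acts like a small, fast cache
# in front of an "expensive" lookup that stands in for main memory.
from functools import lru_cache

@lru_cache(maxsize=2)          # a deliberately tiny cache
def read_word(address):
    # Stand-in for a slow main-memory read (the computation is made up).
    return address * 2

read_word(10)                  # miss: fetched from "main memory", then cached
read_word(10)                  # hit: served straight from the cache
read_word(11)                  # miss
info = read_word.cache_info()
print(info.hits, info.misses)  # prints: 1 2
```

Repeated requests for the same address are answered from the cache, which is exactly the access pattern (locality) that makes real cache memory effective.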




'''The steps to access the data from cache memory are:'''
* A request is made by the CPU
* The cache is checked for the data
* If the data is found in the cache, it is returned to the CPU (this is called a cache hit)
* If the data is not found in the cache (a cache miss), it is fetched from main memory and returned to the CPU.
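The steps above can be sketched as a few lines of Python. The addresses, values, and the dictionary-based "cache" are all made up for illustration; real caches work on fixed-size lines and use hardware replacement policies.

```python
# Hypothetical sketch of the lookup steps: check the cache first,
# fall back to main memory on a miss, and keep a copy for next time.
main_memory = {0x10: "alpha", 0x20: "beta"}  # made-up addresses/values
cache = {}

def cpu_read(address):
    # Step 1: a request is made by the CPU (calling this function).
    if address in cache:                  # Step 2: the cache is checked
        return cache[address], "hit"      # Step 3: cache hit
    value = main_memory[address]          # Step 4: miss, read main memory
    cache[address] = value                # copy it in for future requests
    return value, "miss"

print(cpu_read(0x10))  # ('alpha', 'miss')  first access misses
print(cpu_read(0x10))  # ('alpha', 'hit')   second access hits
```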


There is a wonderful analogy I found [https://www.quora.com/Computer-Architecture-What-is-the-L1-L2-L3-cache-of-a-microprocessor-and-how-does-it-affect-the-performance-of-it here]. If you are confused about cache memory, I suggest you read the top part of this story.


'''Cache memory is fast because:'''
* In the case of a CPU cache, it is faster because it is on the same die as the processor. In other words, the requested data does not have to be bussed over to the processor; it is already there.
* In the case of the cache on a hard drive, it is faster because it is in solid-state memory, not still on the rotating platters.
* In the case of the cache on a web site, it is faster because the data has already been retrieved from the database (which, in some cases, could be located anywhere in the world).


So it is mostly about locality: caching eliminates the data-transfer step.


Locality is a fancy way of saying data that is "close together," either in time or space. Caching with a smaller, faster (but generally more expensive) memory works because typically a relatively small amount of the overall data is the data that is being accessed the most often.<ref>http://programmers.stackexchange.com/questions/234253/why-is-cpu-cache-memory-so-fast</ref>
DRAM today has a cycle time of around 70ns. Cache is on-die static RAM and has an access time of around 6ns.<ref>https://www.quora.com/What-is-the-difference-between-RAM-and-Cache-memory</ref>
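Using the approximate figures above (6 ns cache, 70 ns DRAM), the average memory access time can be estimated with the standard hit-rate formula. The 95% hit rate here is an assumed, illustrative value, not a figure from the text.

```python
# Average memory access time (AMAT) sketch.
# The 6 ns and 70 ns figures come from the text; the 95% hit rate is assumed.
cache_time = 6    # ns, on-die SRAM access
dram_time = 70    # ns, DRAM cycle time (paid only on a miss)
hit_rate = 0.95   # assumed, illustrative

amat = round(cache_time + (1 - hit_rate) * dram_time, 1)
print(amat)  # prints: 9.5
```

Even a modest hit rate pulls the average access time close to cache speed (9.5 ns) and far from DRAM speed, which is why a small cache helps so much.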




== Do you understand this topic? ==  
<html>
<iframe width="560" height="315" src="https://www.youtube.com/embed/6JpLD3PUAZk" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</html>


== Standards ==
* Explain the use of cache memory.




== References ==
<references />

Latest revision as of 08:08, 18 October 2021
