Computer Architecture – Cache Analysis and Amdahl’s Law

The first part analyzes cache performance; the second part covers Amdahl's law.

Expected running time behavior under three operating conditions

  • If the current data set fits in the cache:
    • All data accesses hit the cache
  • If the current data set is slightly too big to fit in the cache:
    • Most data accesses hit the cache, some miss
    • Slightly longer average data-access time
  • If the current data set is much too big to fit in the cache:
    • All data accesses miss the cache
    • Much longer average data-access time (see the sketch after this list)
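These three regimes can be made visible with a small microbenchmark. The sketch below is one way to do it in C; it assumes a POSIX clock_gettime timer and a 64-byte cache line, and the buffer sizes are illustrative rather than tuned to any particular machine. It strides through working sets of increasing size and reports the average time per access, which should jump noticeably once the working set outgrows the cache.

    #define _POSIX_C_SOURCE 199309L
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Traverse a working set of `bytes` bytes `passes` times, touching one byte
     * per (assumed) 64-byte cache line; return the average seconds per access. */
    static double time_traversal(size_t bytes, int passes)
    {
        volatile char *buf = malloc(bytes);
        for (size_t i = 0; i < bytes; i++)          /* touch once so pages are resident */
            buf[i] = (char)i;

        struct timespec start, end;
        unsigned sink = 0;
        clock_gettime(CLOCK_MONOTONIC, &start);
        for (int p = 0; p < passes; p++)
            for (size_t i = 0; i < bytes; i += 64)  /* one access per cache line */
                sink += buf[i];
        clock_gettime(CLOCK_MONOTONIC, &end);
        (void)sink;                                  /* keep the accumulated result "used" */

        free((void *)buf);
        double secs = (double)(end.tv_sec - start.tv_sec)
                    + (double)(end.tv_nsec - start.tv_nsec) / 1e9;
        return secs / ((double)passes * (double)(bytes / 64));
    }

    int main(void)
    {
        /* Sizes chosen to sit below, near, and well above a typical cache. */
        size_t sizes[] = { 16 * 1024, 256 * 1024, 64 * 1024 * 1024 };
        for (int i = 0; i < 3; i++) {
            int passes = (int)((256u * 1024 * 1024) / sizes[i]);  /* equal total work per size */
            printf("%8zu KiB: %.2f ns/access\n",
                   sizes[i] / 1024, time_traversal(sizes[i], passes) * 1e9);
        }
        return 0;
    }

On a machine with, say, a 32 KiB L1 and a few MiB of last-level cache, the smallest size should report mostly hit latency, the middle one a mix of hits and misses, and the largest one mostly miss latency.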

Amdahl’s Law

For a program, the speedup achieved through parallelization is limited by the portion of the application that remains serial.

This means diminishing returns: adding more processors yields successively smaller improvements in speedup.
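In formula terms, if P is the fraction of the program that can be parallelized and N is the number of processors, Amdahl's law gives Speedup(N) = 1 / ((1 − P) + P / N), which is bounded above by 1 / (1 − P) no matter how large N grows. The short C sketch below tabulates this curve; the 90% parallel fraction is just an illustrative assumption.

    #include <stdio.h>

    /* Sketch of Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N),
     * where p is the fraction of the program that can be parallelized.
     * The value p = 0.90 is an illustrative assumption, not a measurement. */
    int main(void)
    {
        const double p = 0.90;                       /* assumed parallel fraction */
        const int counts[] = { 1, 2, 4, 8, 16, 64, 1024 };
        const int n_counts = (int)(sizeof counts / sizeof counts[0]);

        for (int i = 0; i < n_counts; i++) {
            int n = counts[i];
            double speedup = 1.0 / ((1.0 - p) + p / n);
            printf("%5d processors: speedup = %6.2f\n", n, speedup);
        }
        /* Even with unlimited processors, speedup is bounded by 1 / (1 - p) = 10x. */
        return 0;
    }

With P = 0.90 the speedup climbs quickly for the first few processors but can never exceed 10x, which is exactly the diminishing-returns behavior described above.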
