The first part is analyzing cache performance. The second part is on Amdahl’s law.
Expected running-time behavior under three operating conditions:
- If current data set fits in cache:
- All cache hits
- If the current data set is slightly too big to fit in the cache:
- Most data accesses hit the cache; some miss
- Slightly longer average data-access time
- If the current data set is much too big to fit in the cache:
- All data accesses miss the cache
- Much longer average data-access time
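The three regimes can be modeled with the classic average-memory-access-time (AMAT) formula: hit time plus miss rate times miss penalty. A minimal sketch; the latencies and miss rates below are illustrative assumptions, not measured values.

```python
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    """Average memory access time: hit time plus the expected miss penalty."""
    return hit_time_ns + miss_rate * miss_penalty_ns

# Assumed numbers: 1 ns cache hit, 100 ns miss penalty.
fits_in_cache    = amat(1, 0.00, 100)  # all hits -> 1 ns average
slightly_too_big = amat(1, 0.05, 100)  # mostly hits -> slightly longer
much_too_big     = amat(1, 0.95, 100)  # almost all misses -> much longer
```

Even a small miss rate raises the average noticeably, because the miss penalty is so much larger than the hit time.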
For a program, the speedup achieved through parallelization is limited by the portion of the application that is serial.
This implies "diminishing returns": adding more processors leads to successively smaller improvements in speedup.
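Amdahl's law makes the limit concrete: with serial fraction s, the speedup on n processors is 1 / (s + (1 - s)/n). A small sketch; the 10% serial fraction used below is an assumed example value.

```python
def amdahl_speedup(serial_fraction, n_processors):
    """Speedup predicted by Amdahl's law for a given serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

# With 10% serial work, speedup can never exceed 1/0.1 = 10x,
# no matter how many processors are added -- diminishing returns.
for n in (2, 8, 64, 1024):
    print(n, round(amdahl_speedup(0.1, n), 2))
```

Doubling the processor count from 2 to 4 helps far more than doubling from 512 to 1024, since the serial fraction dominates once the parallel part is already fast.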