public class ConcurrentLRUCache<K,V>
An LRU cache implementation based upon ConcurrentHashMap and other techniques to reduce contention and synchronization overhead, in order to utilize multiple CPU cores more effectively.
Note that the implementation does not follow a true LRU (least-recently-used) eviction strategy. Instead, it strives to remove the least recently used items, but when the initial cleanup does not remove enough items to reach the 'acceptableWaterMark' limit, it can remove more items forcefully, regardless of access order.
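The approximate-LRU idea described above can be sketched in plain Java. This is a hypothetical, heavily simplified illustration (the class name, fields, and single-threaded sweep are assumptions for exposition, not the cache's actual internals): each entry carries a last-access stamp taken from a shared counter, reads update the stamp without locking, and once the map grows past `upperWaterMark` a sweep removes the oldest-stamped entries until the size drops to `lowerWaterMark`.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch of an approximate-LRU cache: no lock on the read path,
// eviction by last-access stamp rather than a strict LRU ordering.
public class ApproxLRUSketch<K, V> {
    private static class Entry<V> {
        volatile long lastAccessed;
        final V value;
        Entry(V value, long stamp) { this.value = value; this.lastAccessed = stamp; }
    }

    private final ConcurrentHashMap<K, Entry<V>> map = new ConcurrentHashMap<>();
    private final AtomicLong accessCounter = new AtomicLong();
    private final int upperWaterMark;
    private final int lowerWaterMark;

    public ApproxLRUSketch(int upperWaterMark, int lowerWaterMark) {
        this.upperWaterMark = upperWaterMark;
        this.lowerWaterMark = lowerWaterMark;
    }

    public V get(K key) {
        Entry<V> e = map.get(key);
        if (e == null) return null;
        e.lastAccessed = accessCounter.incrementAndGet(); // no lock on the hot path
        return e.value;
    }

    public void put(K key, V value) {
        map.put(key, new Entry<>(value, accessCounter.incrementAndGet()));
        if (map.size() > upperWaterMark) sweep();
    }

    // Evict the oldest-stamped entries until the size drops to lowerWaterMark.
    // The real cache can run cleanup on a separate thread; this sketch sweeps inline.
    private synchronized void sweep() {
        while (map.size() > lowerWaterMark) {
            K oldestKey = null;
            long oldestStamp = Long.MAX_VALUE;
            for (Map.Entry<K, Entry<V>> e : map.entrySet()) {
                if (e.getValue().lastAccessed < oldestStamp) {
                    oldestStamp = e.getValue().lastAccessed;
                    oldestKey = e.getKey();
                }
            }
            if (oldestKey == null) break;
            map.remove(oldestKey);
        }
    }

    public int size() { return map.size(); }
}
```

Because readers only bump a stamp instead of re-linking a shared list, concurrent `get` calls never contend with each other; the trade-off is that eviction order is only approximately LRU.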
public ConcurrentLRUCache(int upperWaterMark,
public ConcurrentLRUCache(int size,
Returns the 'n' oldest-accessed entries present in this cache.
This uses a TreeSet to collect the 'n' oldest items, ordered by ascending last-access time, and returns a LinkedHashMap containing at most 'n' entries.
n - the number of oldest items needed
a LinkedHashMap containing at most 'n' entries
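The TreeSet-based collection described above can be sketched as follows. The `Item` record and its `lastAccessed` stamp are illustrative assumptions standing in for the cache's internal entry type; the technique shown is the one the description names: a TreeSet ordered by ascending last-access time, capped at 'n' elements, copied into a LinkedHashMap so iteration runs oldest-first.

```java
import java.util.Comparator;
import java.util.LinkedHashMap;
import java.util.TreeSet;

public class OldestItems {
    // Illustrative stand-in for a cache entry: a key, its last-access stamp, and a value.
    public record Item(String key, long lastAccessed, Object value) {}

    // Collect the 'n' oldest-accessed items in a TreeSet ordered by ascending
    // lastAccessed, then copy them into a LinkedHashMap (oldest first).
    public static LinkedHashMap<String, Object> oldestAccessedItems(Iterable<Item> entries, int n) {
        TreeSet<Item> oldest = new TreeSet<>(
                Comparator.comparingLong(Item::lastAccessed).thenComparing(Item::key));
        for (Item item : entries) {
            oldest.add(item);
            if (oldest.size() > n) oldest.pollLast(); // drop the newest of the candidates
        }
        LinkedHashMap<String, Object> result = new LinkedHashMap<>();
        for (Item item : oldest) result.put(item.key(), item.value());
        return result;
    }
}
```

If the cache holds fewer than 'n' entries, the TreeSet simply never fills up, which is why the result may contain fewer than 'n' entries.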