We've all been there: 'If only I had a cache, I wouldn't have to make users wait a few seconds while I request this data from the network.'

Android actually provides ways to gracefully handle this through various options to persist your data.

On top of those, it also provides us with an LruCache.

This post explains a cache that leverages LruCache and adds time as one of the factors when storing data. While building the Xero Me app, because of the sensitive nature of the information we display, we are required to authenticate users before showing them any data. For security reasons we do not persist any user data on the device in databases or internal storage.

However, while using the app, users switch between different screens and often end up revisiting some of them. In our first iteration of the app, we simply reloaded the data whenever a user re-entered a screen. That got the job done, but it also meant a lag or a loading screen a little too often while the data loaded.

Doesn't Android provide LruCache? Of course it does. LruCache works well for what it does, but I wanted resources to expire once a certain amount of time has passed since they were fetched, to avoid showing stale data. How long data can live before it counts as stale depends on the requirements and the request at hand. That is the reason behind this TimedLruCache, which is in fact just a wrapper around LruCache.

The implementation should be pretty straightforward, so I am just going to put it here.

import android.util.LruCache;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class TimedCache<K, V> {

    private final LruCache<K, V> mLruCache;
    private final long mExpiryTimeInMillis;
    private final Map<K, Long> mTimeMap;

    /**
     * @param maxSize for caches that do not override sizeOf(K, V), this is the maximum number of
     * entries in the cache. For all other caches, this is the maximum sum of the
     * sizes of the entries in this cache.
     * @param expiryTimeInMillis the period after which the entries in the cache will be assumed
     * to have expired and will not be returned.
     */
    public TimedCache(int maxSize, long expiryTimeInMillis) {
        mLruCache = new LruCache<>(maxSize);
        mTimeMap = new ConcurrentHashMap<>();
        mExpiryTimeInMillis = expiryTimeInMillis;
    }

    private boolean isValidKey(K key) {
        return key != null && mTimeMap.containsKey(key);
    }

    public synchronized V get(K key) {
        return getIfNotExpired(key, System.currentTimeMillis() - mExpiryTimeInMillis);
    }

    /**
     * @param oldestValidTimestamp entries stored before this timestamp are treated as expired.
     */
    public synchronized V getIfNotExpired(K key, long oldestValidTimestamp) {
        if (!isValidKey(key)) {
            return null;
        }
        if (mTimeMap.get(key) >= oldestValidTimestamp) {
            return mLruCache.get(key);
        } else {
            return null;
        }
    }

    public synchronized void put(K key, V value) {
        if (key != null && value != null) {
            mLruCache.put(key, value);
            mTimeMap.put(key, System.currentTimeMillis());
        }
    }

    public synchronized void remove(K key) {
        if (key != null) {
            mLruCache.remove(key);
            mTimeMap.remove(key);
        }
    }

    public synchronized void clear() {
        mLruCache.evictAll();
        mTimeMap.clear();
    }
}
Now that we have that, all we have to do to use it is to instantiate it like we do with LruCache and use it to store and retrieve data.

TimedCache<String, Object> mTimedCache = new TimedCache<>(50, DEFAULT_CACHE_TIME_IN_MILLIS);

// Put user
mTimedCache.put(XERO_USER, user);

// Retrieve user
User user = (User) mTimedCache.getIfNotExpired(XERO_USER, System.currentTimeMillis() - 15 * 60 * 1000);

If you ever want to remove the data, you just do it by passing the key.

mTimedCache.remove(XERO_USER);

Clearing the cache:

mTimedCache.clear();
The use of this cache in the Xero Me application has significantly improved the performance and experience of using the app. To give a basic idea of how it is used, we have a DataManager that all components talk to when they need data. The DataManager checks whether the data is available in the cache: if so, it returns it; otherwise it fetches the data from the server, caches it, and then returns it to the client. Of course, not all data is cacheable, but it is up to the DataManager to decide which, and that in itself might warrant another small post.
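To make the cache-aside flow above concrete, here is a minimal, hedged sketch of such a DataManager. It is not the Xero Me implementation: DataManager, getUser, fetchFromServer, and the map-based timed cache are all stand-ins (plain Java, so it runs without android.util.LruCache), but the check-cache-else-fetch-and-store logic is the pattern described in the paragraph above.

```java
import java.util.HashMap;
import java.util.Map;

public class DataManagerSketch {

    // A cached value together with the time it was stored.
    static class TimedEntry {
        final Object value;
        final long storedAt;
        TimedEntry(Object value, long storedAt) {
            this.value = value;
            this.storedAt = storedAt;
        }
    }

    static class DataManager {
        private final Map<String, TimedEntry> cache = new HashMap<>();
        private final long expiryMillis;
        int serverHits = 0; // exposed only so the demo can show the effect

        DataManager(long expiryMillis) {
            this.expiryMillis = expiryMillis;
        }

        // Cache-aside: return cached data if it is still fresh,
        // otherwise fetch from the server, cache it, and return it.
        Object getUser(String key) {
            TimedEntry entry = cache.get(key);
            long now = System.currentTimeMillis();
            if (entry != null && entry.storedAt >= now - expiryMillis) {
                return entry.value;
            }
            Object fresh = fetchFromServer(key);
            cache.put(key, new TimedEntry(fresh, now));
            return fresh;
        }

        // Stand-in for a network call.
        private Object fetchFromServer(String key) {
            serverHits++;
            return "user-data-for-" + key;
        }
    }

    public static void main(String[] args) {
        DataManager manager = new DataManager(15 * 60 * 1000);
        manager.getUser("XERO_USER"); // first call goes to the "server"
        manager.getUser("XERO_USER"); // second call is served from the cache
        System.out.println("server hits: " + manager.serverHits);
    }
}
```

Within the expiry window, repeated requests for the same key never touch the network; once the entry is older than expiryMillis, the next request falls through to a fresh fetch.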

Tags: Cache, TimedCache, DataManager