We are using spring-cache without any specific provider, because our cache usage is very lightweight and simple: we intend to store no more than 100 records in it.

However, the one thing we do need is to be sure that the cache does not grow without bound at runtime, leaving us with a huge memory footprint and eventually an OOM error. IMHO it would be unnecessary to bring in another caching framework; that is too heavy a dependency for such a small task.

I think it would be nice if Spring provided an implementation where the cache stores at most N key/value pairs and then starts removing the oldest mappings so that it does not consume more memory at runtime. That is exactly what we need. We can, of course, create some dummy methods like:

@CacheEvict(cacheNames = "myCache")
public void removeFromCache(String txnId) {
    // intentionally empty: the annotation evicts the entry whose key is txnId
}
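In practice we would also need something that actually calls such methods with the right keys, or a periodic bulk clear along the following lines. This is only a rough sketch: the class name, the 10-minute interval, and the assumption that @EnableScheduling and @EnableCaching are configured elsewhere are all illustrative.

import org.springframework.cache.Cache;
import org.springframework.cache.CacheManager;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class MyCacheJanitor {

    private final CacheManager cacheManager;

    public MyCacheJanitor(CacheManager cacheManager) {
        this.cacheManager = cacheManager;
    }

    // Crude safety valve: wipe the cache periodically so it can never grow without bound.
    @Scheduled(fixedRate = 600_000)
    public void evictAll() {
        Cache cache = cacheManager.getCache("myCache");
        if (cache != null) {
            cache.clear();
        }
    }
}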

But either way it is just a workaround.

The out-of-the-box implementation, ConcurrentMapCache, offers no such thing. Again, we could introduce Caffeine or EhCache, but it would be nice to have this feature in place so that we do not have to integrate another library.

Comment From: bclozel

Thanks for raising this issue. The ConcurrentMapCache implementation is designed for "testing or simple caching scenarios" only. If additional behavior is needed, other cache providers are usually the best solution. In your case, Caffeine seems to be a very good choice, as you can tailor the cache behavior for exactly what you need.
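For illustration, capping the cache at the 100 entries you mention is essentially one builder call on Caffeine's cache manager. This is only a sketch: the configuration class name is made up, "myCache" and the limit come from your description, and @EnableCaching is assumed to be configured elsewhere.

import com.github.benmanes.caffeine.cache.Caffeine;
import org.springframework.cache.CacheManager;
import org.springframework.cache.caffeine.CaffeineCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class CacheConfig {

    @Bean
    public CacheManager cacheManager() {
        CaffeineCacheManager cacheManager = new CaffeineCacheManager("myCache");
        // Keep at most 100 entries; Caffeine evicts entries beyond that bound.
        cacheManager.setCaffeine(Caffeine.newBuilder().maximumSize(100));
        return cacheManager;
    }
}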

We could introduce a new cache implementation that extends AbstractValueAdaptingCache and is backed by Spring's ConcurrentLruCache. The main problem is that as soon as you consider the LRU case, there are other things you probably want to consider as well: eviction based on least-frequently-used entries instead, size-based eviction, time-based eviction, pinning particular entries, etc.

In your case, if you really want to avoid adding another library to your application, I would suggest trying to implement the ConcurrentLruCache variant (by extending AbstractValueAdaptingCache) and reporting back to us with your experience. I suspect that Caffeine, even for small caches, is still very efficient and would provide the best value. I'm leaving this issue open for now so other team members can comment, but I think we probably don't want to expand our footprint here and would rather favor cache providers.
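A rough sketch of such a custom cache follows. For simplicity it backs the cache with a synchronized, access-ordered LinkedHashMap rather than ConcurrentLruCache; the class name and constructor arguments are purely illustrative, and this is not something we ship.

import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.Callable;

import org.springframework.cache.support.AbstractValueAdaptingCache;

public class BoundedLruCache extends AbstractValueAdaptingCache {

    private final String name;
    private final Map<Object, Object> store;

    public BoundedLruCache(String name, int maxEntries) {
        super(true); // allow null values, as ConcurrentMapCache does by default
        this.name = name;
        // Access-ordered LinkedHashMap that drops the eldest entry once the cap is exceeded.
        this.store = Collections.synchronizedMap(
                new LinkedHashMap<Object, Object>(16, 0.75f, true) {
                    @Override
                    protected boolean removeEldestEntry(Map.Entry<Object, Object> eldest) {
                        return size() > maxEntries;
                    }
                });
    }

    @Override
    public String getName() {
        return this.name;
    }

    @Override
    public Object getNativeCache() {
        return this.store;
    }

    @Override
    protected Object lookup(Object key) {
        return this.store.get(key);
    }

    @Override
    @SuppressWarnings("unchecked")
    public <T> T get(Object key, Callable<T> valueLoader) {
        // Not atomic; acceptable for a small, simple cache where duplicate loads are harmless.
        Object storeValue = this.store.get(key);
        if (storeValue != null) {
            return (T) fromStoreValue(storeValue);
        }
        try {
            T value = valueLoader.call();
            put(key, value);
            return value;
        }
        catch (Exception ex) {
            throw new ValueRetrievalException(key, valueLoader, ex);
        }
    }

    @Override
    public void put(Object key, Object value) {
        this.store.put(key, toStoreValue(value));
    }

    @Override
    public void evict(Object key) {
        this.store.remove(key);
    }

    @Override
    public void clear() {
        this.store.clear();
    }
}

Instances of such a cache could then be registered with a SimpleCacheManager (or any custom CacheManager) so that the caching annotations pick them up.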

Comment From: snicoll

I think it would be nice if Spring provided an implementation where the cache stores at most N key/value pairs and then starts removing the oldest mappings so that it does not consume more memory at runtime. That is exactly what we need.

I understand that, but ConcurrentMapCache is by design very simple. Adding additional features such as the one you're mentioning is a step too far, as anyone else could legitimately ask us to support "the only thing that they need". I don't think there's anything wrong with adding a cache library for such a use case, and if you don't want to do that, then crafting your own based on what already exists is our recommendation.