Can I rely on Redis for high-performance lookups of 20+ items at a time?

Category: azure cache


imsam67 on Sat, 02 Jul 2016 18:25:10

I'm planning on relying on Redis cache for the following scenario and I want to make sure I'm making the right decision.

Our application uses DocumentDb and Azure Table Storage (ATS) as our primary databases. We also use Azure SQL Databases, but NoSQL databases are our primary data source. With that said, there's a lot of denormalization in the data, i.e. we save the same data multiple times and rely on worker roles to keep it in sync.

We have, however, decided NOT to save some data multiple times if the data is prone to frequent changes. In that scenario, we're planning on relying on Redis cache.

Here's one example: we decided NOT to store multiple copies of user avatar data because people change their avatars somewhat frequently. Our solution is to simply read the necessary avatar information for users and keep it in Redis cache.

This means, when we get user comments -- which could be 20 or 30 items at a time -- we'll have to get those 20-30 users' avatars from Redis. Is this going to have a noticeable impact on our response times?

We're not sure yet how we would read those 20+ avatars, i.e. one query with 20+ keys or multiple queries.

The application's responsiveness is super important to us, and we want to make sure that Redis is the right way to handle this scenario.

Thanks, Sam


tilovell09 on Thu, 07 Jul 2016 20:58:51

Hi Sam,
You could use Redis cache and do an MGET for this scenario if you want an app that is super-responsive to avatar changes (instead of, e.g., using a CDN to cache avatar resources, or caching them in the user's browser). In terms of resource usage, that is probably the most efficient way to read back the avatars.
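As a rough sketch of the MGET approach: the key scheme `avatar:{user_id}` below is an assumption (the thread doesn't specify one), and a tiny in-memory stub stands in for a real Redis client, whose MGET call has the same shape (e.g. `mget` in redis-py, or `StringGet` with a key array in StackExchange.Redis). The point is that all 20-30 avatars come back in a single round trip.

```python
class FakeRedis:
    """Minimal in-memory stand-in for a Redis client (illustration only)."""
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def mget(self, keys):
        # One round trip returns all requested values in order;
        # missing keys come back as None, just like Redis MGET.
        return [self._store.get(k) for k in keys]


def get_avatars(client, user_ids):
    """Fetch avatars for one page of comments in a single MGET round trip."""
    keys = ["avatar:{0}".format(uid) for uid in user_ids]  # assumed key scheme
    values = client.mget(keys)
    # Pair each user id with its avatar; None marks a cache miss to backfill
    # from the primary store.
    return dict(zip(user_ids, values))


client = FakeRedis()
client.set("avatar:42", b"<42 avatar bytes>")
client.set("avatar:7", b"<7 avatar bytes>")

avatars = get_avatars(client, [42, 7, 99])
print(avatars[42])  # b'<42 avatar bytes>'
print(avatars[99])  # None (cache miss)
```

With a real client you would swap `FakeRedis` for the actual connection; the lookup function itself stays the same.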

Impact on latency will depend a lot on the total number of bytes in the responses to your MGET requests, but also on what percentage of your max network bandwidth you are using (once you get up to 80-90%, more and more requests have to be queued).

So you might want to consider doing something like limiting requests to only 10 avatars at a time (e.g., loading them in batches as the user scrolls down the page) and setting limits on the size of the avatar images you will store in the cache, all to help ensure you don't overload your cache bandwidth-wise.
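The batching idea above is just list chunking on the client side; a minimal sketch (the batch size of 10 comes from the suggestion above, and each chunk would then be one MGET call):

```python
def chunked(ids, size=10):
    """Split a list of user ids into batches of at most `size` for MGET calls."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]


# e.g. a comment page referencing 23 distinct users:
user_ids = list(range(23))
batches = chunked(user_ids, 10)
print([len(b) for b in batches])  # [10, 10, 3]
```

Each batch stays small enough to keep individual responses well under the bandwidth ceiling, and later batches can be deferred until the user actually scrolls to them.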

Hope this helps,