FastCGI_Cache on nginx reverse proxy, possible?


I am trying to setup a load-balanced small cluster to accommodate growing traffic on my WooCommerce site. The topology is going to be like this:

  1. Load balancer running nginx
  2. 2 application servers (running nginx + PHP-FPM on each)
  3. 1 database server running Percona MySQL

I’ve been playing around with nginx since I found your tutorials. Really good tutorials, by the way. But I also noticed that your tuts are basically for single-server setups and not really for a clustered approach.

I looked around the Web and couldn’t really find how to use FastCGI_Cache on an nginx load balancer. And judging from the config, my guess is that it works mainly on one server. So I have a few questions.

1.) Can I use FastCGI_Cache on a load balancer running nginx? If so, any tuts or links would be appreciated.

2.) If I have no choice but to use the proxy_cache module on the load balancer, then what should I do about the WooCommerce-specific configuration (skipping cache on certain dynamic pages, skipping cache on a filled cart, skipping cache on some cookies, etc.)?

If there is any other load-balancing setup with caching you would recommend, please let me know too.

Really sorry for the delayed reply.

We are updating our docs, so new tutorials haven’t been added for some time.

Not sure if your thread is solved.

Anyway, to answer your questions:

  1. There are two ways. The front-end nginx can talk FastCGI directly to PHP-FPM running behind the load balancer. This way the backend servers won’t need nginx at all; PHP-FPM just needs to listen on a public or private IP, depending on your networking.

Alternatively, the front-end nginx can use proxy_cache, which works much the same as fastcgi_cache. We use proxy_cache on the front end to cache the responses from the backend nginx + PHP servers.

  2. You can add the same exceptions to proxy_cache as to fastcgi_cache. That way the proxy will only cache things that are also being cached by fastcgi_cache on the backend. Please note that if you use proxy_cache and/or fastcgi_cache, you may need to put more work into your cache-purging logic.
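For anyone landing on this thread later, here is a rough sketch of the first option: the load balancer’s nginx does fastcgi_cache itself and passes PHP requests straight to an upstream of backend PHP-FPM pools. The IPs, paths, zone name, and skip rules below are placeholder assumptions (a typical WordPress/WooCommerce pattern), not a tested template — adjust everything for your own setup.

```nginx
# On the load balancer. Backend app servers run only PHP-FPM,
# listening on their private IPs (not unix sockets).

fastcgi_cache_path /var/run/nginx-cache levels=1:2 keys_zone=WORDPRESS:100m inactive=60m;

upstream php_backends {
    server 10.0.0.11:9000;   # app server 1 (placeholder IP)
    server 10.0.0.12:9000;   # app server 2 (placeholder IP)
}

server {
    listen 80;
    server_name example.com;
    root /var/www/html;      # must mirror the docroot on the app servers

    set $skip_cache 0;

    # Never cache POSTs or requests with a query string
    if ($request_method = POST) { set $skip_cache 1; }
    if ($query_string != "") { set $skip_cache 1; }

    # WooCommerce dynamic pages
    if ($request_uri ~* "/cart/|/checkout/|/my-account/|/wp-admin/|wc-api") {
        set $skip_cache 1;
    }

    # Logged-in users, filled carts, WooCommerce sessions
    if ($http_cookie ~* "wordpress_logged_in|woocommerce_items_in_cart|wp_woocommerce_session") {
        set $skip_cache 1;
    }

    location / {
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass php_backends;

        fastcgi_cache WORDPRESS;
        fastcgi_cache_valid 200 301 302 60m;
        fastcgi_cache_bypass $skip_cache;
        fastcgi_no_cache $skip_cache;
    }
}
```

With this layout the cache lives in one place (the load balancer), so purging is simpler than keeping per-backend caches in sync — the trade-off the answer above mentions about cache-cleaning logic.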

Sorry to say we don’t have any tuts or links to share, as our load-balancer configuration varies from client to client. But we will soon share a template on the new documentation site, which is going live before month end. You may subscribe for updates here -

Hi Rahul,

Was wondering if you are planning to put up this tutorial/template. I think it would be very useful.


Hey Vidyut,

I actually managed to do that. I will be putting up a tutorial when time permits.

1 Like

I have been trying to get this working for a long time. In production right now I have one nginx reverse proxy which caches images, JPEGs and CSS, and load-balances the rest to 3 backend nginx web servers running PHP-FPM. I am using W3TC with WordPress and can handle maybe 1,800 active users before the CPUs start to melt on the backend. Because I couldn’t get fastcgi_cache to work properly, I waste a lot of CPU on the backend. I see you were able to get this working; if you do have a tutorial, it would be greatly appreciated. Thanks, Kris

What does your site do?

A customer of mine has 6k users online on a 4 GB RAM, 4 vCPU VPS, using Redis as a cache. Server load seldom spikes beyond 1.0.

Before we chose Redis, we used WP Super Cache, and the server could manage 4k simultaneous users before the load went too high.

What are your needs for such a complicated scenario?

We are a local community portal. I have been testing Redis, but I can’t get it to leave my mobile pages alone. I use WPtouch for mobile, which serves different CSS/HTML/theme for the same link based on the user agent. If I could get Redis to stop serving cached desktop views of pages to my mobile visitors, I think Redis would help greatly.
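If the page cache ends up at the nginx layer (fastcgi_cache, as discussed earlier in this thread), one common way to keep WPtouch’s mobile output separate is to fold a user-agent class into the cache key, so desktop and mobile copies of the same URL are cached independently. The regex and names below are illustrative assumptions only; a Redis page-cache plugin would need its own equivalent “vary by device” setting, since it caches at the PHP level rather than in nginx.

```nginx
# Classify requests by user agent (regex is a rough placeholder —
# tune it to match whatever WPtouch considers "mobile").
map $http_user_agent $device_class {
    default                                                     desktop;
    ~*(iphone|ipod|android.*mobile|windows\s+phone|blackberry)  mobile;
}

server {
    # ... usual WordPress + PHP-FPM config ...

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;   # placeholder socket path

        fastcgi_cache WORDPRESS;
        # Include the device class in the key so a mobile visitor can
        # never be served a cached desktop page (and vice versa).
        fastcgi_cache_key "$scheme$request_method$host$request_uri$device_class";
    }
}
```

The cost is a doubled cache footprint for pages that have both variants, but that is usually cheaper than regenerating the page on every mobile hit.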