I know this isn’t directly related to WordPress and/or Nginx, but I have noticed that even with FastCGI_Cache or W3TC running, a server on a cheap host can be taken down with nothing more than random page requests that return a 404, or worse, by simply adding a random query string to the end of the URL. Is there any fix, or any recommendations on how to deal with this?
Navigate to http://example.com/hello-world/, which is a valid page on most WordPress sites. With a caching plugin/module in place it will be cached, and even a million subsequent requests would not tax the system at all.
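For reference, the kind of setup I mean looks roughly like this (zone name, paths, and times are just illustrative, taken from the stock Nginx FastCGI cache directives):

```nginx
# Illustrative fastcgi_cache setup: after the first request,
# a valid page like /hello-world/ is served straight from cache.
fastcgi_cache_path /var/run/nginx-cache levels=1:2 keys_zone=WORDPRESS:100m inactive=60m;
fastcgi_cache_key "$scheme$request_method$host$request_uri";

server {
    listen 80;
    server_name example.com;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php-fpm.sock;  # example socket path
        fastcgi_cache WORDPRESS;
        fastcgi_cache_valid 200 60m;
    }
}
```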
Navigate to http://example.com/some_random_page35345/ and you will get a 404, which will be cached after the first request, but only for that exact URL. Each additional request to a different nonexistent page forces PHP to run and the caching module to do its work all over again. Running a script that requests random pages will eventually lock up the system, either the swap daemon or the PHP daemon.
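One mitigation I have seen suggested for the 404 case is to have Nginx cache error responses for a short period too, so each nonexistent URL only hits PHP once per cache window (the times below are just examples):

```nginx
# Cache successful responses for an hour, and 404s briefly,
# so repeated requests for the same bogus URL never reach PHP.
fastcgi_cache_valid 200 301 302 60m;
fastcgi_cache_valid 404 10m;
```

This doesn’t stop a script that generates a brand-new random URL on every request, though, since each unique URL is still a cache miss the first time.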
Navigating to http://example.com/?1=1 won’t land on a 404 page, but it will bypass the caching module and force the system to generate a fresh version of the page. As above, on a small server, like one you might use on Digital Ocean, this would be disastrous, as it doesn’t even require a script: simply refreshing a browser or sending many cURL requests with anything in the query string is enough. Again, this causes either the swap daemon or the PHP daemon to lock up or even crash entirely.
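For the query-string bypass, two ideas come to mind, sketched below with standard Nginx directives (zone names and limits are made up; note that keying the cache on $uri instead of $request_uri also ignores legitimate query strings such as searches, so it would need exceptions in real use):

```nginx
# Ignore the query string in the cache key, so /?1=1 and / share one
# cache entry. CAVEAT: this also ignores legitimate query strings
# (e.g. ?s=search), so exceptions would be needed.
fastcgi_cache_key "$scheme$request_method$host$uri";

# Throttle any single client to roughly 5 requests/sec to PHP,
# with a small burst allowance, so refresh-hammering can't pile up.
limit_req_zone $binary_remote_addr zone=wplimit:10m rate=5r/s;

server {
    location ~ \.php$ {
        limit_req zone=wplimit burst=10 nodelay;
        # fastcgi_pass etc. as in the normal setup
    }
}
```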
I am sure more than a few of you have come across this in your testing, and I am guessing someone has a fix, but I wanted to be as informative as I could in describing the problem.