Performance tips?

I'm running WordPress and nginx on a very slow machine, namely a BeagleBone Black. It is an ARM board with 512 MB of RAM and a 1 GHz CPU. Given these restrictions I'm aware that I cannot expect THAT much performance. On the other hand I'm curious what is possible, and how to tune the setup to find out. Some of your performance tips therefore don't really apply, because the BBB is quite limited.

I followed the tips for W3TC as well as for fastcgi caching, but the results hardly change. To give an idea what I mean: when I run `ab -n 100 -c 10` against a static page containing just phpinfo(), I get an average of around 2 seconds. When I run the same test against a simple text page served by WordPress as the homepage, I get around 20 seconds. But I get almost the same results without the optimizations mentioned above, so there is not much change. By "not much change" I mean: since there is always some variation in the results, I don't count a 5% deviation, because that might be related to CPU-governor issues.

I would have expected something like 10 seconds in that benchmark, especially since it is the same page every time and the page should be cached anyway. Are my expectations wrong? I mean, I thought serving a cached page should perform similarly to serving a static page, which is what W3TC is supposed to provide. Or did I miss something in the config (e.g. PHP5-FPM/FastCGI config, MySQL)?

P.S.: I don't expect a perfectly responsive web server on this hardware. But 20 seconds is a little too much of an OUCH..

First, can you verify if fastcgi-cache is working on your end?

Also, for testing purposes, you can create a 512MB RAM VPS backed by SSD on DigitalOcean. They charge less than 1 cent/hour and give $10 worth of credit to all new accounts.

In any case, verifying whether the page is cached properly is the most important thing.
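A quick way to check from the command line is to look at the cache-status response header. This sketch assumes your nginx config exposes the status via something like `add_header X-Cache $upstream_cache_status;` — the header name `X-Cache` is an assumption, so match it to whatever your config actually sets:

```shell
#!/bin/sh
# Print the fastcgi-cache status (HIT/MISS/BYPASS) from HTTP response
# headers read on stdin. "X-Cache" is an assumed header name -- use
# whatever name your add_header directive actually sets.
cache_status() {
  grep -i '^x-cache:' | awk '{print $2}' | tr -d '\r'
}

# Against a live site you would pipe curl's headers in:
#   curl -sI http://example.com/ | cache_status
# Demo with a captured response (the first request is typically a MISS):
printf 'HTTP/1.1 200 OK\r\nX-Cache: MISS\r\n\r\n' | cache_status
```

Run the curl check twice in a row: a cacheable page should go from MISS to HIT on the second request.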

First of all, I see what you mean regarding checking the HTTP headers, but I had problems figuring out how to retrieve them on Win7. Telnet didn't work because there was no real reply, so I switched to the Linux system and did the following with wget. (Suggestion: perhaps an update on how to check the headers would be helpful ;) )

But from the results I have no real clue whether the caching is working or not. One line says HIT (at the very end), but the other says BYPASS. Since I can repeat the requests within seconds and the results remain the same, I don't really understand it...

And to my understanding both requests are the same, so caching should still be in effect after only 2 seconds. Or do I misunderstand something?

I already explained earlier how to use the curl command.

If curl is not present on your Ubuntu/Linux box, you can install it with `apt-get install curl`.

Sorry but I won't be able to support any kind of Windows.

Also, a BYPASS will remain as it is for subsequent requests.

If a URL is cacheable, it will return MISS for the very first request and whenever the cache has expired; all other times it will return HIT.
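For context: a BYPASS means a skip-cache condition matched the request, e.g. a logged-in cookie, a POST, or a query string. The rules below are a sketch of the usual WordPress skip-cache conditions, not necessarily your exact config, so compare them against what you have:

```nginx
# Requests matching these conditions are never served from
# (or stored in) the fastcgi cache, so they always show BYPASS.
set $skip_cache 0;

if ($request_method = POST) { set $skip_cache 1; }
if ($query_string != "")    { set $skip_cache 1; }
if ($http_cookie ~* "comment_author|wordpress_[a-f0-9]+|wp-postpass|wordpress_logged_in") {
    set $skip_cache 1;
}

fastcgi_cache_bypass $skip_cache;
fastcgi_no_cache $skip_cache;
```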

Uhu, thx for the clarification. I didn't look at the checklist page for how to retrieve the header info; I was searching around in the very first link you provided regarding checking whether the cache is working. Sorry for the confusion.

Since I already got the headers, I don't need any additional support on this issue.

Thx as well for clarifying what BYPASS, MISS and HIT mean.

The only question left is regarding cache expiry. According to the headers, the page expires after 10 minutes. On the one hand, I don't see this value defined anywhere in the config file, so I assume it comes from some default. On the other hand, I see some discussion on your blog about expires settings, but I have no real idea which elements they are useful for (I guess it should apply to the curl test against index.php, but I'm not really sure). Did you already post some conclusions? If so, would you mind giving a link? If not, could you briefly explain your experiences with the expires settings?

You can add an expires header anywhere, i.e. in a server block for site-wide effect, or in a location block for a particular section.
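As for the 10-minute lifetime you observed: for the fastcgi cache itself, the validity period usually comes from a fastcgi_cache_valid directive in the nginx config (the 10m below simply mirrors what you saw — check your actual config for the real value; the Expires response header itself may also be set by the expires directive or by WordPress/PHP):

```nginx
# Keep cached copies of 200/301/302 responses for 10 minutes;
# other status codes (e.g. 404) can be given their own lifetime.
fastcgi_cache_valid 200 301 302 10m;
```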

Sample code is below:

```nginx
# Cache static files for as long as possible
location ~* \.(ogg|ogv|svg|svgz|eot|otf|woff|mp4|ttf|css|rss|atom|js|jpg|jpeg|gif|png|ico|zip|tgz|gz|rar|bz2|doc|xls|exe|ppt|tar|mid|midi|wav|bmp|rtf)$ {
    expires max;
    log_not_found off;
    access_log off;
}
```

Make sure you remove css/js from the list of extensions if your site generates css/js dynamically (in short: if the site looks broken).

Apart from setting expires to the maximum possible value, the code block above also disables logging for static-content requests, which saves disk I/O as well as CPU cycles.
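To confirm the expires block is taking effect, inspect the caching headers nginx sends for a static file. In this sketch, example.com and the asset path are placeholders for your own site:

```shell
#!/bin/sh
# Extract the caching-related headers (Expires, Cache-Control)
# from HTTP response headers read on stdin.
caching_headers() {
  grep -iE '^(expires|cache-control):' | tr -d '\r'
}

# Against a live site:
#   curl -sI http://example.com/wp-content/uploads/logo.png | caching_headers
# Demo with a captured response; "expires max" produces headers like:
printf 'HTTP/1.1 200 OK\r\nExpires: Thu, 31 Dec 2037 23:55:55 GMT\r\nCache-Control: max-age=315360000\r\n\r\n' | caching_headers
```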

Thx for the information and your support. It helped a lot.

Glad to know. Have a nice weekend. :slight_smile: