Why WordPress Cache Plugins Suck and I Never Use Them

You probably know about the importance of page caching for optimizing website performance. Caching plugins, such as W3 Total Cache, WP Super Cache, and WP Rocket, have long been a popular and convenient solution to speed up WordPress websites.

However, PHP-based caching plugins are simply inferior to server-level caching solutions like Nginx’s FastCGI cache and Varnish: they are both slower and more resource-demanding.

PHP-based caching and why it sucks

PHP-based caching plugins work by storing cached versions of web pages in the WordPress database or on the file system. This approach requires the PHP interpreter to execute the caching plugin’s code on every request, introducing several performance bottlenecks:

  1. PHP Interpreter Overhead: The PHP interpreter must be invoked on every request, incurring a significant overhead in terms of CPU cycles and memory allocation. This overhead is exacerbated by the fact that PHP is an interpreted language, meaning that the code must be parsed and executed on the fly.
  2. Database Queries: When storing cached pages in the WordPress database, PHP-based caching plugins must execute database queries to retrieve and update cache entries. These queries can lead to increased latency, especially under high traffic conditions.
  3. File System I/O: When storing cached pages on the file system, PHP-based caching plugins must perform file system I/O operations, which can be slow and resource-intensive, especially on disk-based storage systems.
  4. Cache Invalidation: PHP-based caching plugins must implement cache invalidation mechanisms to ensure that cached pages are updated when the underlying content changes. This can lead to additional overhead and complexity.

Server-based caching and why it rocks

Server-based caching solutions, such as Nginx’s FastCGI Cache and Varnish, operate at a lower level in the technology stack, caching pages before they even reach the PHP interpreter. This approach offers several advantages:

  1. Native Code Execution: Server-based caching solutions are typically implemented in native code (e.g., C or C++), which executes much faster than interpreted PHP code.
  2. In-Kernel Caching: Server-based caching solutions can leverage in-kernel caching mechanisms, such as the Linux kernel’s page cache, to store cached pages. This approach reduces the overhead associated with file system I/O and database queries.
  3. Asynchronous I/O: Server-based caching solutions can utilize asynchronous I/O operations, allowing them to process multiple requests concurrently and reducing the latency associated with cache retrieval and updates.
  4. Simplified Cache Invalidation: Server-based caching solutions can implement cache invalidation mechanisms that are more efficient and less complex than those used in PHP-based caching plugins.

Server-based caching solutions: FastCGI, Varnish, and Cloudflare

Now, why did I include Cloudflare here? While it’s not a server-based caching solution (it’s cloud-based), it works on the same principle: pages are cached before they ever reach the PHP interpreter, so you can skip the PHP-based caching plugin entirely and get a far more effective caching solution.

FastCGI cache: a native NGINX cache

FastCGI caching is built into Nginx’s FastCGI module, allowing you to cache pages at the server level. Here’s a brief overview of its architecture:

  1. FastCGI Protocol: Nginx uses the FastCGI protocol to communicate with the PHP interpreter; once a response is cached, subsequent requests are served directly from the cache without invoking PHP at all.
  2. Cache Storage: Nginx stores cached pages in a designated cache directory, using a combination of file system and in-kernel caching mechanisms to optimize performance.
  3. Cache Invalidation: Nginx implements a simple and efficient cache invalidation mechanism, using a timestamp-based approach to ensure that cached pages are updated when the underlying content changes.
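To make this concrete, here is a hedged sketch of a minimal FastCGI cache setup. The cache path, zone name (WORDPRESS), cache lifetimes, and the PHP-FPM socket path are all illustrative assumptions; adapt them to your own stack.

```nginx
# In the http block: define where cached pages live and a shared memory zone
# for cache keys (paths and sizes here are assumptions, not recommendations).
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=WORDPRESS:100m inactive=60m;

server {
    set $skip_cache 0;
    # Never serve cached pages to logged-in users or for POST requests
    if ($request_method = POST)                    { set $skip_cache 1; }
    if ($http_cookie ~* "wordpress_logged_in")     { set $skip_cache 1; }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_pass unix:/run/php/php-fpm.sock;   # assumed PHP-FPM socket

        fastcgi_cache WORDPRESS;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 301 302 60m;       # cache successful responses for 1h
        fastcgi_cache_bypass $skip_cache;
        fastcgi_no_cache $skip_cache;

        # Expose HIT/MISS/BYPASS for debugging and log analysis
        add_header X-FastCGI-Cache $upstream_cache_status;
    }
}
```

The `X-FastCGI-Cache` header makes it easy to verify from the browser or curl whether a page was served from the cache.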

The best part of FastCGI cache is that it’s a native part of Nginx and ready to use out of the box on many installations. There’s no need to install and manage any third-party applications, and it’s fairly easy to configure compared with Varnish, which is highly complex.

Varnish: a high-performance caching proxy

Varnish is a powerful caching proxy that sits in front of your server, caching pages before they even reach the PHP interpreter. Here’s a brief overview of its architecture:

  1. Reverse Proxy Architecture: Varnish operates as a reverse proxy, intercepting incoming requests and caching pages before they are forwarded to the origin server.
  2. Cache Storage: Varnish stores cached pages in memory by default (with optional file-backed storage), which makes cache retrieval extremely fast.
  3. Cache Invalidation: Varnish implements a sophisticated cache invalidation mechanism, using a combination of timestamp-based and content-based approaches to ensure that cached pages are updated when the underlying content changes.
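As a rough sketch, a minimal Varnish configuration for WordPress might look like the following VCL. The backend address, port, and TTL are assumptions; a production setup typically needs more cookie and URL handling than this.

```vcl
vcl 4.1;

backend default {
    .host = "127.0.0.1";
    .port = "8080";    # assumed: Nginx/Apache listening behind Varnish
}

sub vcl_recv {
    # Never cache logged-in users or POST requests
    if (req.http.Cookie ~ "wordpress_logged_in" || req.method == "POST") {
        return (pass);
    }
    # Strip remaining cookies so more requests share a cache entry
    unset req.http.Cookie;
}

sub vcl_backend_response {
    set beresp.ttl = 1h;    # cache backend responses for an hour
}
```

Varnish then listens on port 80 and only forwards cache misses to the web server behind it.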

Varnish requires you to install and configure an extra application in your server stack. It is fairly difficult to configure but extremely flexible in terms of rules. Since I prefer to keep my server stack as simple as possible, and Varnish is only slightly faster than FastCGI cache, I choose FastCGI cache over Varnish.

Cloudflare: a cloud-based caching solution

Cloudflare is a cloud-based caching solution that operates as a reverse proxy, caching pages at the edge of the network. Here’s a brief overview of its architecture:

  1. Edge Network: Cloudflare operates a global network of edge servers, caching pages at the edge of the network to reduce latency and improve performance.
  2. Cache Storage: Cloudflare stores cached pages in a distributed cache, using a combination of memory and disk-based storage to optimize performance.
  3. Cache Invalidation: Cloudflare implements a sophisticated cache invalidation mechanism, using a combination of timestamp-based and content-based approaches to ensure that cached pages are updated when the underlying content changes.

I absolutely love Cloudflare for the Edge network caching, security and flexible configuration options. There are even plugins you can use to configure the cache for you, such as Super Page Cache for Cloudflare, but if you know how to set up the cache rules you don’t need them.
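If you do set up the rules yourself, Cloudflare cache rules use filter expressions. A hedged sketch of a typical WordPress setup (the exact rule names and UI fields vary; the expressions below are illustrative):

```
# Rule 1 – Bypass cache when the visitor is logged in or in the admin area:
(http.cookie contains "wordpress_logged_in") or
(starts_with(http.request.uri.path, "/wp-admin"))

# Rule 2 – Otherwise, mark the request as eligible for cache ("Cache Everything"):
(http.request.full_uri wildcard "*")
```

The first rule must be ordered before the second so logged-in traffic is never served a cached page.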

Efficiency and performance comparison

To compare the efficiency and performance of FastCGI, Varnish, and Cloudflare, let’s take a look at some key metrics:

  1. Cache Hit Ratio: The cache hit ratio measures the percentage of requests that are served from the cache. A higher cache hit ratio indicates better performance.
  2. Latency: Latency measures the time it takes for a request to be processed and returned to the client. Lower latency indicates better performance.
  3. Throughput: Throughput measures the number of requests that can be processed per second. Higher throughput indicates better performance.
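The cache hit ratio in particular is easy to measure yourself. As a small illustration (with made-up sample data), here is how you might compute it from the `HIT`/`MISS`/`BYPASS` statuses that Nginx or Varnish can write to the access log:

```python
from collections import Counter

# Hypothetical sample of upstream cache statuses pulled from an access log
statuses = ["HIT", "HIT", "MISS", "HIT", "BYPASS", "HIT", "MISS", "HIT"]

counts = Counter(statuses)
# BYPASS requests never touch the cache, so exclude them from the ratio
cacheable = counts["HIT"] + counts["MISS"]
hit_ratio = counts["HIT"] / cacheable

print(f"Hit ratio: {hit_ratio:.0%}")  # prints "Hit ratio: 71%" for this sample
```

In practice you would parse the statuses out of your real log file, but the arithmetic is the same.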

Based on previous tests and experience with the different caching solutions, here’s an overview of the approximate performance you can expect from each solution:

Solution      Cache Hit Ratio   Latency    Throughput
FastCGI       80-90%            10-20ms    500-1000 req/s
Varnish       90-95%            5-10ms     1000-2000 req/s
Cloudflare    95-99%            5-10ms     2000-5000 req/s

Performance comparison table

Conclusion

All three caching solutions offer excellent performance and efficiency. However, Cloudflare stands out, with a cache hit ratio of 95-99% and latency as low as 5-10ms. Varnish comes in second, with a cache hit ratio of 90-95% and latency as low as 5-10ms. FastCGI trails behind with a cache hit ratio of 80-90% and latency as low as 10-20ms.

When choosing a server-based caching solution, consider the following factors:

  1. Scalability: If you need to handle high traffic volumes, Cloudflare’s edge network and distributed cache make it an excellent choice.
  2. Ease of use: If you’re looking for a simple, built-in caching solution, FastCGI may be the best option.
  3. Customizability: If you need fine-grained control over your caching configuration, Varnish may be the best choice.

Personally, my clear favorite is Cloudflare, and for a number of reasons.

  1. Edge Network. Because of this, the cache is closer to your users than any cache running on your own server. This means faster loading times for your visitors.
  2. Cache Configuration. Cloudflare offers very granular control in their cache rules. This means you can easily adapt your cache to almost any scenario you need.
  3. CDN. Cloudflare is a cloud-based cache solution (a CDN), which means cached pages are delivered without affecting your server load at all; visitors never reach your server directly, except when the cache needs to be refreshed.
  4. Security. Cloudflare’s WAF offers an amazing opportunity to protect your website from hackers and bots that spam, scrape, or slow down your website. With Cloudflare, you’re getting performance and security with the same setup.

I have written extensively on my blog about the many ways you can use Cloudflare to improve your website’s security as well as preserve your server’s resources.

Are there any other ways of optimizing my website besides using cache plugins?

Sure there are. There are plenty of other things you can do to optimize your WordPress website such as:

  1. Install and configure Redis object cache.
  2. Optimize the WP database.
  3. Defer or delay JavaScript and CSS files.
  4. Make 3rd party requests local.
  5. Disable bloat and unnecessary functions.
  6. Unload plugins on pages where they aren’t needed.

One way to disable bloat, defer/delay JS and CSS, unload plugins and make Google fonts and analytics run locally is to use Perfmatters to optimize your site. This does not cover all the optimizations you can do, but it will go a long way to help.
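For the first item in the list above, enabling a Redis object cache typically means installing a drop-in plugin and pointing it at your Redis server from wp-config.php. As an illustrative assumption, the widely used Redis Object Cache plugin reads constants like these:

```php
<?php
// wp-config.php – connection constants read by the Redis Object Cache plugin
// (host and port below are the common local defaults, shown as an assumption)
define( 'WP_REDIS_HOST', '127.0.0.1' );
define( 'WP_REDIS_PORT', 6379 );
```

Note that an object cache speeds up repeated database lookups; it complements page caching rather than replacing it.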

Do you want the fastest possible WordPress website?

If you do, then consider my speed optimization service. You can also choose to let me host and manage your website for you, so you can focus on your business.
