Just when you thought installing OpenLiteSpeed couldn’t get any easier, it does! With our new ols1clk installation script, you can install OLS (and optionally MySQL and WordPress) with literally one click!
Since then, an Nginx developer suggested some configuration tuning to make the Nginx results better. We found that some of those configurations are not likely to be used in a production environment. Notably:
Our internal evaluation confirmed that Nginx was indeed faster with its benchmark-tuned configuration. That said, should these kinds of configurations be used in a production server?
LiteMage has many settings which can be fine-tuned to increase store performance. Today, let’s discuss a trick that nearly all stores can benefit from: keeping your public cache warm.
What is a warm cache?
A warm cache already contains your data, in this case objects and pages. When these are freshly stored in your cache, they can be served by LiteSpeed Web Server directly. This prevents PHP from being invoked and hitting the Magento backend, meaning your users can access these objects and pages faster. Thus, it would be ideal to keep your cache warm as long as possible. LiteMage can keep your cache warm indefinitely.
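While LiteMage handles warming automatically, the idea behind a cache warm-up is simple: visit every public page once so a fresh copy lands in the cache before real users arrive. The sketch below illustrates the concept in Python, assuming your store publishes a standard sitemap.xml (the sitemap URL and page list here are placeholders, not part of LiteMage itself):

```python
# Conceptual cache-warming sketch: fetch each URL from a sitemap once
# so the server stores a fresh cached copy of every public page.
import xml.etree.ElementTree as ET
from urllib.request import urlopen

# Namespace used by the standard sitemap protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return the list of page URLs found in a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def warm_cache(sitemap_url):
    """Request every listed page once; the responses themselves are discarded."""
    with urlopen(sitemap_url) as resp:
        urls = parse_sitemap(resp.read())
    for url in urls:
        urlopen(url).read()  # this first hit populates the cache
```

A scheduled job running something like `warm_cache("https://example.com/sitemap.xml")` (hypothetical URL) after each cache flush would keep pages warm, which is essentially what LiteMage's built-in crawler does for you without any scripting.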
In early January, we blogged about the great feedback that our LiteSpeed Cache Plugin for WordPress (LSCWP) was receiving. Since its official release on January 20th, LSCWP has continued to receive very positive feedback. Three months and 2,000 downloads later, we thought we’d share some of this feedback and how it has shaped the development of our LSCWP plugin.
Magento 2 introduced many improvements over Magento 1.9. Magento’s built-in PageCache module is now included in both Magento 2 Community and Enterprise Editions, the checkout process has been streamlined, the code modernized, performance improved, and table locking reduced. These improvements make Magento 2 faster and more stable than its predecessor, but it can still be made better.
For example, you could implement a faster full page caching solution, such as Varnish. Unfortunately, this can over-complicate your stack, requiring extra components such as an NGINX reverse proxy and a Varnish cache instance.
Now, there’s an easier way.
A few weeks back, Jarrod from rootusers.com posted a benchmark that demonstrated that when handling small static files, Nginx outperformed our OpenLiteSpeed Web Server, particularly during the 1 and 2 CPU Core tests. We decided to dig deeper and investigate these results.