After 4 years with nginx, we switched to Caddy - Here is why

When we started Hashnode in 2015, we wanted to keep things really simple. Since nginx was the most popular choice, we decided to use it as a reverse proxy in front of our Node.js backend. It was a simple setup and worked really well for us for years.

Our requirements changed in mid-2019 with the Devblog initiative. In case you aren't aware, Devblog lets you create a programming blog instantly and host it on your own domain while still being a part of the Hashnode community. However, the biggest challenge was serving those domains over HTTPS.

We started looking for a web server that generates SSL certs on demand (preferably using Let's Encrypt) and maintains them. None of the existing CDN providers fit our criteria, but one particular setup worked for us: OpenResty + Lua. I wrote a couple of articles demonstrating how we generate SSL certs for custom domains using OpenResty.

OpenResty uses nginx at its core and lets you add dynamic logic using Lua.

In March 2019, Hashnode transitioned from being just a community of programmers to a community of independent thinkers and educators who have the liberty of building their own authority using a domain of their choice. OpenResty with the lua-resty-auto-ssl plugin helped us generate SSL certs for arbitrary custom domains for free. We launched the platform in public beta, and this was a big win for us.

Fast forward 3 months -- as part of the public beta program, we unlocked the personal blogs of nearly 2000 users. Around 100 of them decided to use a custom domain. This is where the actual problems started.

  • The SSL handshake started degrading after around 50 custom domains; sometimes it took ~1.5s to complete the SSL negotiation. I tried a bunch of things to fix this, e.g. configuring nginx to use multiple CPU cores, caching SSL sessions, using fast ciphers, tuning dhparams, etc. None of that fixed the problem, even though I had allocated more than enough RAM to cache the certs in memory.

  • OpenResty started crashing once in a while when the SSL certs couldn't be generated. I raised an issue on GitHub, but didn't see any activity on my thread.
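For reference, the tuning attempts mentioned in the first point map to nginx directives roughly like these (a sketch with example values, not our actual config):

```nginx
# Illustrative nginx SSL tuning -- example values, not our exact settings
worker_processes auto;                   # use all available CPU cores

http {
    ssl_session_cache   shared:SSL:20m;  # share SSL sessions across workers
    ssl_session_timeout 1h;              # keep sessions reusable for longer
    ssl_protocols       TLSv1.2;
    ssl_ciphers         ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256;
    ssl_prefer_server_ciphers on;
    ssl_dhparam         /etc/nginx/dhparam.pem;  # pre-generated DH parameters
}
```

None of these made a meaningful difference in our case; the handshake latency stayed high once the number of custom domains grew.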

So, our initial promise of delivering a PageSpeed score of 100 to every blog was broken. The score hovered around ~95 -- which was unacceptable. I wanted every blog on Hashnode to be really fast and score a perfect 100.

Enter Caddy. I briefly examined Caddy in early 2019 but at that time it didn't support on-demand TLS in a cluster mode. Since we wanted to build a CDN, it didn't fit the bill at that time. But I had a chance to re-evaluate Caddy this month and I was blown away by the progress.

Caddy v1.0.0 gives you a lot of things out of the box. Some of the highlights are:

  • Automatic HTTPS with HTTP/2
  • On-demand TLS that works across multiple nodes (cluster mode) with no extra config
  • Super simple configuration and easy to get started

When I rediscovered Caddy and saw the progress, I decided to try it out. After testing it for a couple of weeks, I switched to it completely.

We are testing out our CDN with 5 nodes at the following locations:

  • Singapore
  • Bangalore
  • Frankfurt
  • San Francisco
  • New York

So, I had to install Caddy on each node. Our Caddyfile (the equivalent of nginx.conf) looks like the following:

* {
    proxy / localhost:3000 {
        transparent
    }
    tls {
        # enables on-demand TLS; the limit here is an example value
        max_certs 500
    }
}

The block above proxies every request to our Node.js backend and enables on-demand TLS, so Caddy provisions a cert the first time a new custom domain is requested. For domains whose certs we already have on disk, the tls directive simply points at the files:

* {
    proxy / localhost:3000 {
        transparent
    }
    tls ./ssl/fullchain.pem ./ssl/privkey.pem
}
It's very easy to read/understand and much simpler than an nginx config file.

Since all of the nodes need access to the SSL certs, I created a separate VM for storing the certs and mounted it on each of these 5 machines as a shared disk. Caddy handles the rest: generating certs by hitting Let's Encrypt's APIs, caching them in memory for faster access, coordinating renewals with the other VMs, and so on.
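As a rough sketch, the per-node setup looks like this (the hostname and paths are hypothetical; Caddy v1 reads its certificate storage location from the CADDYPATH environment variable, which is what lets all nodes share one cert store):

```shell
# Hypothetical example -- mount the shared cert volume on this node
sudo mount -t nfs certstore.internal:/certs /mnt/caddy-certs

# Point Caddy's storage at the shared disk so all 5 nodes see the same certs
export CADDYPATH=/mnt/caddy-certs

# Start Caddy with the Caddyfile shown above
caddy -conf /etc/Caddyfile
```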

The result:

All in all, we achieved our goals and now use a web server that's easy to configure and supports on-demand TLS out of the box!

Here is a report from WebPageTest that sheds some light on the benchmarks. The bottom line is that it's really fast, and even with a hundred domains served from a single machine, SSL negotiation is still optimal.

I am really impressed with Caddy. If you are aiming to create a multi-tenant app (aka a SaaS product), you should totally check out Caddy. An OpenResty-based setup works for some people, but in my opinion Caddy is a much better choice. Plus, there is a team actively developing and maintaining the software -- which means if you need professional support, you can actually buy one of the support packages and get their help.

Even if you are not building a SaaS app, Caddy is still a great choice. The fact that it serves HTTPS out of the box is itself a big win. Additionally, the config is super easy and the web server is as fast as nginx -- so there is no reason you shouldn't use it for your next project.

I should also mention that Caddy is free for commercial use if you build it from source. If you use their download page and intend to use the binaries for a commercial project, you need to buy a license. However, building from source is pretty easy, and you don't need to be an expert in Go -- you can learn more about it on their GitHub page.
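For what it's worth, building v1 from source is just a couple of commands, assuming a working Go toolchain (the import path below is the one Caddy used at the time of writing):

```shell
# Fetch and build Caddy v1 from source (the binary lands in $GOPATH/bin)
go get github.com/mholt/caddy/caddy

# Verify the build
$GOPATH/bin/caddy -version
```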

Let me know what you think in the comments below. Don't have a Devblog yet? Unlock yours here -- it's free.

Michaela Greiler

Very interesting insights. Thanks for sharing, I definitely learned something new!

Serdar Sanri

This is really interesting, Sandeep. I am working on a side project with kind of the same concept (multiple tenants have their own domains under the same application), and this issue has been on my mind for when it goes to production. Locally I use Docker with the nginx-proxy Let's Encrypt companion container, and it has been a PIA for some time. I will definitely keep this post in my favs.

alin debian

Very useful story. Thanks for it!

Gijo Varghese

Awesome article!

So Caddy server acts as a cache layer that caches HTML pages, right? How do you clear the Caddy server cache if a particular page is updated?

Sandeep Panda

Sorry about the delay. I took a short break and didn't have a chance to go through the comments.

When any post is updated, we send a message to each of the edge nodes via RabbitMQ and simply delete the .html file from the disk. That takes care of the cache purging.

Gijo Varghese

Sandeep Panda So it clears the entire HTML file? An edit to your blog post shouldn't clear the cache of other users' blogs, right?

Yashu Mittal

The Caddy team is working on v2 -- are you guys planning to migrate to v2?

Once done, please share your thoughts and process for the same.

Raghavan alias Saravanan Muthu

Hey Sandeep, the stats look very interesting. I have been obsessed with Apache and keep hearing about Nginx replacing Apache -- and now yet another buddy joins the list: Caddy!

Thanks for sharing your experiences along with the features and benefits of Caddy Server. Sure, will check this out.