I switched to Caddy recently on one of our servers to host multiple applications and am super happy about its simplicity and ease of use. Here’s how I use Caddy with Quasar.
What is Caddy?
Caddy is a web server much like Nginx. It calls itself “a new kind of extensible platform for server apps”. Features include -
- Easy configuration (super readable config. files - we will get to this in a bit)
- Configuration is exposed as APIs (JSON files can be used to create/change configuration)
- An extensible plugin system
- Automatic HTTPS using Let's Encrypt
Caddy does all this with a single executable, and a single configuration file.
Caddy version 2 brought some significant, breaking changes from v1. Since I did not really use v1, I can't compare how the two fare. Caddy is free to use and is licensed under the Apache 2.0 open source license.
Why use Caddy?
The most important factor for me - Caddy enables quick setup of a server with multiple apps without spending hours on configuration.
As a developer rather than a sysadmin, I approach server configuration with some skepticism. I cannot get away from it entirely because self-managed servers are cheaper, but I would rather spend time creating something else than fiddling with server configuration.
Nginx has been a great companion. The configuration files are just OK and I don't mind them as long as they are set up only once. However, introducing multiple apps, APIs and static file hosting on the same server confuses me and requires quite a bit of trial-and-error configuration.
Over the years I have tried solving the problem with a few options -
- Use cookie cutter config files with Nginx
- Nginx configuration file generator
- Alternate server options (there are many options but one needs to be careful about a solution that is too niche or has a thin community behind it)
- Self-hosting solutions like Parse that let me get away without backend configuration changes
It is with that background that I find Caddy a refreshing change. With minimal configuration, easier changes and acceptable performance, Caddy works great for this use case.
My Typical Use Case
My preferred architecture when building “quick” Vue apps -
- A Node server that serves the APIs, plus the database, on a 2-4 GB server
- Front-end hosted on the same server. VueJS is my framework of choice
- Static content on GitHub, Netlify or similar
Each server typically hosts 1-4 production apps or x test/MVP apps.
The Caddy server should -
- handle API requests and proxy them to the Node server. The requests may come from the frontend application or from external systems
- handle frontend requests when routing to different pages
Caddy Configuration
While I found the Caddy documentation to be great in general, the examples seem to be written for a more “in the know” audience. The radical changes from v1 to v2 have also made some of the older content out there irrelevant.
I typically use a Caddyfile to store Caddy configuration. It may not be the most future-proof option, but it is fully supported and quite easy to use. The structure looks somewhat like this -
example.com {
    root * /usr/www/example
    file_server
}
You can store this file wherever you please, but I use /etc/caddy/Caddyfile.
I can now start Caddy with -
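Something along these lines works, using the Caddyfile path above -

caddy run --config /etc/caddy/Caddyfile

caddy run keeps Caddy in the foreground; caddy start does the same in the background.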
I will switch to JSON-based configuration at some point in time, with a typical structure looking like this -
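Here is a rough sketch for the example.com site above - the field names follow Caddy's JSON config structure, so treat this as illustrative rather than a drop-in config -

{
  "apps": {
    "http": {
      "servers": {
        "srv0": {
          "listen": [":443"],
          "routes": [
            {
              "match": [{ "host": ["example.com"] }],
              "handle": [
                { "handler": "file_server", "root": "/usr/www/example" }
              ]
            }
          ]
        }
      }
    }
  }
}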
The JSON format is not difficult to understand, just more “typing”. I don't send API requests to configure web servers (thank heavens), and the admin API port remains closed in the firewall.
Caddy and Quasar
While I knew Caddy's configuration was quite simple, I could not get it to play nicely with the client-side routing of Vue apps at first. After some hit and miss, and a lot of searching on Google, I arrived at this configuration that simply works.
# Caddyfile
example.com {
    root * /usr/www/example
    route {
        reverse_proxy /api/* localhost:3000
        try_files {path} {path}/ /index.html
        file_server
    }
}

another-example.net {
    root * /usr/www/anotherexample
    file_server
}
This file accomplishes the following -
- Host site example.com
  - redirect all requests with /api/* to localhost:3000. The Node server listens on port 3000, and that port is not open to the Internet
  - redirect any other request to / and allow Vue to handle routing. I do this using try_files {path} {path}/ /index.html, which looks for a file with the name passed in the URL and serves it. If no file is found, it falls back to index.html
  - the route block ensures that Caddy follows the given order while handling URLs. URLs with /api/ are served by the Node server and other URLs are handled by client-side routing
- Host site another-example.net
  - serve files in /usr/www/anotherexample
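Whenever the Caddyfile changes, the new configuration can be applied without downtime; with the path used earlier, something like this should do it -

caddy reload --config /etc/caddy/Caddyfile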
All I now need to do is point the domain names to the IP address of this server and both example.com and another-example.net just work!
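On the Quasar side, nothing special is needed for Caddy: build the app and copy the output into the web root that Caddy points at. A sketch, with a placeholder host and the web root from the Caddyfile above -

# build the Quasar app in SPA mode; output lands in dist/spa
quasar build

# copy the build output to the server's web root (user and host are placeholders)
rsync -avz dist/spa/ user@my-server:/usr/www/example/

Any client-side route such as https://example.com/some/page should now return index.html, while anything under /api/ is proxied to the Node server.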
When not to use Caddy?
Caddy works great for all my use cases. I don’t quite build Twitter or Facebook level apps - so I should be ok I guess.
At the same time, I will not migrate everything to Caddy just yet. If you have configured Nginx a couple of times, you know that all it takes is some “lazy copy/paste” to get the whole thing working. Simpler needs = simpler tasks = no need to take drastic action to change the status quo.
You may also not want to use Caddy if you -
- manage a server that handles high traffic volumes. Nginx can handle roughly twice the throughput
- want to consume as little memory as possible. Caddy is lightweight, but not as lightweight as Nginx, and can consume more than double the memory
Finis
I personally believe in a great future for Caddy. It has a great community, good documentation and is a delight to use.
Start using Caddy for your dev/test environments and see the difference. Migrating production environments may be a longer, drawn-out decision depending on your intended use.