Updates vs. version pinning in Docker-based homelab
I'm running a Docker-based homelab that I manage primarily via Portainer, and I'm struggling with how to handle container updates. At first, I had all containers pulling latest, but I thought maybe this was a bad idea as I could end up updating a container without intending to. So, I circled back and pinned every container image in my docker-compose files.
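For illustration, the change looked something like this in each compose file (the service and version here are just placeholders):

```yaml
services:
  web:
    # before: image: nginx:latest
    image: nginx:1.27.1  # pinned: a stack redeploy re-pulls exactly this version
```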
Then I started looking into how to handle updates. I've heard of Watchtower, but I noticed the Linuxserver.io images all recommend against running Watchtower and suggest Diun instead. Looking into Diun, I learned that it notifies you of updates based on the tag you're tracking for each container, meaning it will never do anything for my containers pinned to a specific version. This made me think maybe I've taken the wrong approach.
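For reference, the Diun setup I was reading about boils down to something like this (a sketch based on its docs; the schedule and paths are examples, and I haven't run it myself):

```yaml
services:
  diun:
    image: crazymax/diun:latest
    volumes:
      - ./diun-data:/data
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - TZ=Etc/UTC
      - DIUN_WATCH_SCHEDULE=0 */6 * * *  # check registries every 6 hours
      # enable the Docker provider (by default only containers labeled
      # diun.enable=true are watched)
      - DIUN_PROVIDERS_DOCKER=true
      # plus one of the DIUN_NOTIF_* settings to actually deliver notifications
```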
What is the best practice here? I want to keep things generally up to date, but I don't want to accidentally break anything. My biggest fear with tracking latest is that I make some other change in a docker-compose file and update the stack, which pulls latest for all the containers in that stack and breaks some of them with unintended updates. Is this a valid concern, and if so, how can I overcome it?
Anything that needs a specific version for compatibility (Postgres mostly) is pinned to the major release.
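For example, with tags following Docker Hub's usual convention (the exact image is illustrative):

```yaml
services:
  db:
    image: postgres:16  # tracks the newest 16.x release, never jumps to 17
```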
Stuff occasionally breaks, but I have backups for exactly that reason. (If you don't, set that up before anything else: backups should run at least daily, and you should have at least two different types of backup.)
Proxmox backs up all my VMs/CTs nightly to the Proxmox Backup Server I run as a VM with an external HDD attached. The retention policy keeps around 30 versions, so I can go back pretty far if needed. These are full bootable images and include everything.
Restic (managed via Backrest) runs on any VMs/CTs with critical data and backs up to Backblaze B2 every night as well. This is a more limited selection of just the critical files I'd need, with a retention policy similar to the Proxmox backups.
With both, I try to do a full restore every month or two to test things out.
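For anyone curious, the restic side of this amounts to roughly the following (Backrest schedules these for you; the bucket, paths, and retention numbers here are made up):

```sh
# Credentials for Backblaze B2 plus the repository location
export B2_ACCOUNT_ID=...         # B2 key ID
export B2_ACCOUNT_KEY=...        # B2 application key
export RESTIC_REPOSITORY=b2:my-bucket:homelab
export RESTIC_PASSWORD_FILE=/root/.restic-pass

restic backup /srv/critical                       # nightly snapshot of critical files
restic forget --keep-daily 30 --prune             # retention: keep ~30 daily snapshots
restic restore latest --target /tmp/restore-test  # the periodic restore drill
```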
“Larger” projects tend to follow semantic versioning fairly well, so I pin to the minor version (e.g. postgres:16.4) and pick up updates along the way with minimal risk of breakage from major changes.
For others, I pin to a specific version and update on my own terms.
I've never used Portainer, but does it have an option to only notify of available updates?
For things I don't mind breaking, I use latest. For the services that matter, I use a specific version. Take Immich, for example: in the 2-3 months I've kept it running, there have been 3 breaking changes that would have prevented startup after an update without manual intervention. Immich is an extreme case, though; some other projects have run fine on latest for years without being touched.
I follow the important projects' releases (subscribing where possible) and update manually when they publish an image with a new version. I'd see it as either updating manually and being OK with occasionally being a version behind, or using latest plus auto-updates and being OK with occasionally waking up to broken services. Which might never happen.
@RadDevon While using latest in a production environment is not considered a good idea, I've been using Watchtower in my homelab for years to keep running images up to date without any issue.
Some apps also provide major version tags (e.g. Postgres), so you avoid breaking changes (as long as they adhere to semver).
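If it helps, a typical Watchtower service looks something like this (the schedule is an example; the variable names are from the Watchtower docs):

```yaml
services:
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - WATCHTOWER_CLEANUP=true          # remove superseded images after updating
      - WATCHTOWER_SCHEDULE=0 0 4 * * *  # six-field cron: run daily at 04:00
      # - WATCHTOWER_MONITOR_ONLY=true   # uncomment to only notify, never update
```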
@RadDevon You can also use tools like Renovate or Dependabot to open a pull request whenever an image referenced in your docker-compose files has a new release (they run on GitHub, GitLab, Gitea, Forgejo, etc.).
That leaves you with running tests in your CI pipeline and setting up a deployment step afterwards.
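A minimal Renovate config for that setup could look like this (a sketch; the preset and manager names are from Renovate's docs, and the automerge rule is optional):

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"],
  "enabledManagers": ["docker-compose"],
  "packageRules": [
    {
      "matchUpdateTypes": ["patch"],
      "automerge": true
    }
  ]
}
```

With that, patch bumps merge themselves once CI passes, while minor and major bumps wait in a pull request for you to review.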