Nextcloud seems to have a bad reputation around here regarding performance. It never really bothered me, but when a comment on a post here yesterday talked about huge speed gains to be had with Postgres, I got curious and spent a few hours researching and tweaking my setup.
I thought I'd write up what I learned and maybe others can jump in with their insights to make this a good general overview.
For context, my installation started out with this docker compose stack from the official Nextcloud Docker images (as opposed to the AIO image or a source installation). I run this behind an NGINX reverse proxy.
Sources of information
Server tuning on Nextcloud Docs: Most of these are very basic things that are already taken care of in the Docker image or in the proxy companion image I'm using. The one thing I haven't tried, and that comes up in other places too, is using Imaginary for image preview generation.
Eking out some Nextcloud Performance mainly covers using a socket connection for Redis, but also mentions logging to syslog (I haven't found a good source of information on this), using Postgres, and using Imaginary for image previews.
Improvements
Migrate DB to Postgres
The first thing I did was migrate from MariaDB to Postgres, roughly following the blog post linked above. I didn't do any benchmarking, but page loads felt a little faster afterwards (though a far cry from the "way way faster" claims I'd read).
Here's my process
Add a Postgres container to the compose file (see the sketch after these steps). I named mine "postgres", added a "postgres" volume, and added it to depends_on for the app and cron containers.
Run the migration from the Nextcloud app container like any other occ command: ./occ db:convert-type --password $POSTGRES_PASSWORD --all-apps pgsql $POSTGRES_USER postgres $POSTGRES_DB. The process stopped with an error for a deactivated app, so I removed that app completely, dropped the Postgres tables, and started the migration again; this time it went through. Afterwards, check Admin settings > System to make sure Nextcloud is now using Postgres.
Remove the old "db" container and volume, along with all references to them in the compose file, and run docker compose up -d --remove-orphans.
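For reference, the added service looks roughly like this. This is a minimal sketch, not my exact file: the image tag and the ${POSTGRES_*} variables are assumptions based on the standard postgres image and a .env file, so adapt them to your own stack.

    services:
      postgres:
        image: postgres:16-alpine
        restart: always
        volumes:
          - postgres:/var/lib/postgresql/data   # named volume for the database files
        environment:
          - POSTGRES_DB=${POSTGRES_DB}
          - POSTGRES_USER=${POSTGRES_USER}
          - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}

    volumes:
      postgres:

The app and cron services then simply get postgres added to their depends_on lists.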
Redis over Sockets
I followed the guide above for connecting to Redis over a socket, with the details noted below. This improved performance quite significantly: very fast loads for Files, Calendar, etc. I haven't yet switched the Postgres connection over to a socket, since the article only spoke of minor improvements there, but I might try that next.
Hints
The Redis configuration (host, port, password, ...) needs to be set in config/config.php as well as config/redis.config.php (see the sketch after these hints).
The cron container needs the same /etc/localtime and /etc/timezone volumes the app container got, as well as volumes_from: tmp.
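For illustration, the Redis block in config/config.php ends up looking roughly like this. It's a sketch: the socket path assumes the shared /tmp/docker/ directory from the guide and a redis.sock filename, and the memcache/password lines depend on your existing config.

    'memcache.local' => '\OC\Memcache\APCu',
    'memcache.distributed' => '\OC\Memcache\Redis',
    'memcache.locking' => '\OC\Memcache\Redis',
    'redis' => [
      'host' => '/tmp/docker/redis.sock', // socket path instead of a hostname
      'port' => 0,                        // 0 tells Nextcloud to use the socket
      'password' => '',                   // set this if your Redis requires auth
    ],

If I read the official image's config/redis.config.php correctly, it builds its values from the REDIS_HOST environment variables and treats a value starting with / as a socket path, so those variables and config.php need to agree.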
EDIT: Postgres over Sockets
I'm now connecting to Postgres over a socket as well, which gave another pretty significant speed bump. Looking at the developer tools in Firefox, the dashboard now finishes loading in half the time it did before the change: just over 6 s. I followed the same blog article I used for Redis.
Steps
In the compose file, for the db container: add the /etc/localtime and /etc/timezone volumes; add user: "70:33"; add command: postgres -c unix_socket_directories='/var/run/postgresql/,/tmp/docker/'; add the tmp container to volumes_from and depends_on (see the sketch after these steps).
In Nextcloud's config.php, replace 'dbhost' => 'postgres', with 'dbhost' => '/tmp/docker/',
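Put together, the Postgres service in my compose file now looks roughly like this. Again a sketch: the image tag and environment block are carried over as assumptions from the earlier step, user "70:33" is the postgres UID in the Alpine image paired with the www-data GID, and tmp is the socket-sharing container from the Redis setup.

    postgres:
      image: postgres:16-alpine
      restart: always
      # run as postgres (70) with the www-data group (33) so Nextcloud can use the socket
      user: "70:33"
      # also create the unix socket in the shared /tmp/docker/ directory
      command: postgres -c unix_socket_directories='/var/run/postgresql/,/tmp/docker/'
      volumes:
        - postgres:/var/lib/postgresql/data
        - /etc/localtime:/etc/localtime:ro
        - /etc/timezone:/etc/timezone:ro
      volumes_from:
        - tmp
      depends_on:
        - tmp
      environment:
        - POSTGRES_DB=${POSTGRES_DB}
        - POSTGRES_USER=${POSTGRES_USER}
        - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}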
Outlook
What have you done to improve your instance's performance? Do you know good articles to share? I'm happy to edit this post to include any insights and make this a good source of information regarding Nextcloud performance.
I had been running Nextcloud on an old laptop running Ubuntu, but that machine died. I have a Windows PC, originally built for gaming, that I'm considering using for Nextcloud. Does anyone have experience with NC on Windows? Thoughts on the DB switch on Windows?
100% agree with tofubi: Docker on Windows is a form of self-abuse. It's a train wreck for anything other than a little bit of testing for development work. You will come away with a bad taste in your mouth about Docker; I avoided containers for years because I started with them on Windows.
I've run a lot of different scenarios with Docker, and what I've settled on as the cleanest and easiest to maintain is Debian 12 with the Docker convenience script. It's fast, hassle-free, and doesn't have a bunch of layers of weirdness like Ubuntu Server with a Docker snap, which makes troubleshooting a nightmare.
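For reference, the convenience script install really is just a couple of commands (this is Docker's documented get.docker.com script; review it before running it if piping vendor scripts makes you uneasy):

    # download and run Docker's official convenience script
    curl -fsSL https://get.docker.com -o get-docker.sh
    sudo sh get-docker.sh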
> for anything other than a little bit of testing for development work.
It's really awesome for development work, though. Visual Studio has built-in Docker support, so I can run my app and its unit tests on both Windows and Linux (via Docker) at the same time on the same system during development.
I use Docker in VS Code for LaTeX. It saves me the trouble of having to install TeX Live on my system. I have a task defined that mounts my sources into a container and runs the compilation there, roughly like the sketch below.
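A minimal sketch of such a task in .vscode/tasks.json (the label, image, and main.tex are placeholders; texlive/texlive is the TeX Live image on Docker Hub):

    {
      "version": "2.0.0",
      "tasks": [
        {
          "label": "Build LaTeX in Docker",
          "type": "shell",
          "command": "docker run --rm -v ${workspaceFolder}:/workdir -w /workdir texlive/texlive latexmk -pdf main.tex"
        }
      ]
    }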