Why switch from Google Analytics to Matomo?
For the privacy of your website visitors. That’s it. And there is a chance that the GDPR will make it harder to use Analytics after a ruling in Austria. Even the Dutch authority has updated its guidelines. I manage Matomo for a couple of really big sites and it works really well. You can disable cookies, which gives you less information but also removes the need for the annoying cookie popup. Or you can go the other way and install plugins that add more functionality.
You can use one Matomo instance for multiple websites and applications. I recommend keeping it separate from the website or application server itself for better security and performance. A single VPS should be enough. Hetzner has some nice deals. They have been around for a very long time, so this is not some small business that will disappear next year. I don’t want to spend any more time on this than absolutely necessary, and certainly not on moving the whole thing somewhere else. Even their largest servers are affordable, which may come in handy when your Matomo gets a lot of traffic or your marketing team wants to run complicated analytics on it. They also have backups in the form of snapshots, and an external firewall.
I suggest you start with the smallest VPS and choose the latest Ubuntu LTS release. Then use the web interface to configure a firewall that only allows traffic over ports 22, 80 and 443. This is important because Docker publishes container ports by inserting its own iptables rules, which bypasses host-level firewalls like ufw. And I don’t want to risk leaking data because something started a dashboard or some service that isn’t secure.
Log in over SSH and install some prerequisites:
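The original command isn’t reproduced here; assuming the Docker packages from the Ubuntu repositories, it would be along these lines:

```shell
# Install Docker and Compose from the Ubuntu repositories
apt update
apt install -y docker.io docker-compose

# Make sure the Docker daemon starts now and on boot
systemctl enable --now docker
```

If you prefer Docker’s own repository with the Compose v2 plugin, the commands later on become `docker compose` instead of `docker-compose`.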
Enable automatic updates because this is just basic Ubuntu + Docker and the chance of automatic updates breaking something is really small.
I have a sample config for automatic upgrades that updates everything, automatically removes old packages and reboots when necessary. You can get it here or download it to the right location with:
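The download link isn’t reproduced here. As a sketch of what such a config looks like, these are the standard unattended-upgrades options for the three behaviors described (update everything, remove old packages, reboot when necessary), placed in `/etc/apt/apt.conf.d/50unattended-upgrades`:

```
// Update packages from all configured origins, not just security
Unattended-Upgrade::Origins-Pattern {
        "origin=*";
};
// Remove packages that are no longer needed
Unattended-Upgrade::Remove-Unused-Dependencies "true";
// Reboot automatically when an update requires it
Unattended-Upgrade::Automatic-Reboot "true";
```

Make sure the `unattended-upgrades` package itself is installed (`apt install -y unattended-upgrades`).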
You can use named volumes in the docker-compose file but I prefer bind mounts to a local directory. Volumes hide the files somewhere in the Docker directory and with bind mounts you can see the files, check disk usage and make backups. So we need to manually create some directories:
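The exact directory names are my assumption; they just need to match the bind mounts in the compose file:

```shell
# One directory per container that needs persistent state
mkdir -p /opt/matomo/matomo
mkdir -p /opt/matomo/mysql
mkdir -p /opt/matomo/caddy/data /opt/matomo/caddy/config
```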
And a docker-compose file. You can download it with this line:
or copy and paste it:
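The original file isn’t reproduced here, but based on the setup described (Matomo, a database, Caddy for TLS, bind mounts under /opt/matomo, credentials from the .env file), a sketch could look like this. Image tags, service names and mount paths are my assumptions:

```yaml
version: "3"

services:
  db:
    image: mariadb:10
    restart: always
    volumes:
      - /opt/matomo/mysql:/var/lib/mysql
    environment:
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
      MYSQL_DATABASE: matomo
      MYSQL_USER: matomo
      MYSQL_PASSWORD: ${MYSQL_PASSWORD}

  matomo:
    image: matomo:latest
    restart: always
    depends_on:
      - db
    volumes:
      - /opt/matomo/matomo:/var/www/html
    environment:
      MATOMO_DATABASE_HOST: db

  caddy:
    image: caddy:2
    restart: always
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - /opt/matomo/caddy/data:/data
      - /opt/matomo/caddy/config:/config
    # Automatic HTTPS for the domain from .env, proxied to the Matomo container
    command: caddy reverse-proxy --from ${DOMAIN} --to matomo:80
```

Note that only Caddy publishes ports, and only 80 and 443, which matches the firewall rules from earlier.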
Create an .env file in /opt/matomo:
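The variable names must match whatever the compose file references; something like:

```
MYSQL_ROOT_PASSWORD=changeme
MYSQL_PASSWORD=changeme
DOMAIN=a.example.com
```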
Replace the passwords with your own; you can use this command to generate new ones:
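The original doesn’t say which command; `openssl rand` is one common option:

```shell
# Print a random 32-byte password, base64-encoded
openssl rand -base64 32
```

Run it once per password and paste the results into the .env file.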
DOMAIN is the URL you want to access Matomo on. Make sure you add the DNS record beforehand, because Caddy will immediately request a certificate. I usually use some short subdomain, for example a.runrails.com.
Start the whole thing in the foreground with:
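With Compose v1 from the Ubuntu repositories that would be (with the v2 plugin, `docker compose up`):

```shell
cd /opt/matomo
docker-compose up
```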
I usually check the output for errors and if everything works I stop it and start it in the background with:
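Stop the foreground run with Ctrl+C, then:

```shell
cd /opt/matomo
docker-compose up -d
```

The `restart: always` policy in the compose file means the containers also come back up after the automatic reboots from unattended-upgrades.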
You should now be able to access Matomo on the URL you specified in the DOMAIN variable. It will ask some questions, like the database username and password. Use the same ones as in the .env file and you should be good to go.
Every application you run should also have at least one monitoring check, especially the ones that have automatic server updates enabled like we did here. https://updown.io is my recommended tool for this. I have mine set to check for the text “Username” on the page, so I know the page is not displaying some error. Another benefit is that it notifies you if the certificate is about to expire. Caddy should renew it automatically, but again, it is good to keep an eye on it.