Adventures in Mastodon Self-Hosting: The Story So Far

A couple of months ago, a bunch of my most active Twitter followers were migrating to Mastodon, the federated open-source social media platform. Instead of a single centralized service, it's a wide collection of independent instances that "federate" with each other. Some have compared it to email: you can pick any host (instance) you want and still communicate with people on other hosts. There's nothing stopping Gmail users from sending things to Hotmail users, for example.

I found this very interesting and naturally wanted to run my own instance to check it out: thus was born social.mattburkedev.com.

Steps so far

Acquired a small droplet from DigitalOcean

This was pretty straightforward. This is my first time using DigitalOcean, and I've been impressed with its admin portal. It's really clear and has useful monitoring tools baked in.

Followed the tutorials on the Mastodon website for initial setup

I skipped the Elasticsearch setup because I don't think I'd use it much.

I hit some issues with this. First, I needed Node 16 to be compatible with the build tooling. Unfortunately, the droplet came preconfigured with apt sources that had Node 18. I had to manually install the NodeSource repository for 16, force-install from that source, and then "hold" the package so that it wouldn't get auto-upgraded.

I can’t remember the exact steps, but this seems close based on my history:

# install the node16 source list
curl -fsSL https://deb.nodesource.com/setup_16.x | sudo -E bash -

# force install the v16 package from that source
sudo apt-get install nodejs=16.18.1-deb-1nodesource1

# add a hold so that nodejs doesn't get auto-upgraded
sudo apt-mark hold nodejs
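
To sanity-check the pin, node should report a 16.x version and the package should show up as held:

# confirm the installed version and the hold
node --version
apt-mark showhold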

The next thing I ran into was running out of memory during asset precompilation. This is a webpack build that compiles all the CSS and JavaScript, and it needed more than the 1 GB of RAM the droplet has. I took two steps, the combination of which seemed to work:

  1. Disabled webpack parallelization and devtool. In config/webpack/production.js I set devtool to none and parallel to false. Parallelization runs multiple processes, each of which eats up more memory.
  2. Added a 4 GB swapfile. This turned out to be needed for normal operation of the site as well, so it was a good step regardless.

Swapfile steps:

# allocate a 4GB file to be the swap area
sudo fallocate -l 4G /swapfile

# Make it only available to root
sudo chmod 600 /swapfile

# Turn it into a swapfile
sudo mkswap /swapfile

# Turn on swap
sudo swapon /swapfile

# Add this to /etc/fstab so that it re-enables next boot
# /swapfile none swap defaults 0 0

These steps are based on https://linuxize.com/post/create-a-linux-swap-file/
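
One thing the comment above glosses over: the fstab line has to actually be appended for the swap to survive a reboot, and swapon/free confirm it's active. Roughly:

# persist the swapfile across reboots
echo '/swapfile none swap defaults 0 0' | sudo tee -a /etc/fstab

# confirm swap is active
sudo swapon --show
free -h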

At this point it was up and running.

Disk Space

Mastodon can use a lot of disk space. Each user can have a profile picture and a banner picture (called a header). A post may include image attachments, or may link to an article that has an image. These images are downloaded and cached on the server’s disk.

My droplet only has 25GB of space, and that was quickly used up.
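
A plain df makes the squeeze easy to spot, and most of it is cached media under the Mastodon checkout (public/system by default when not using S3):

# check overall disk usage on the droplet
df -h /

# see how much of it is cached media
du -sh /home/mastodon/live/public/system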

As of today, it looks like this:

mastodon@mastodon:~/live$ bundle exec bin/tootctl media usage
Attachments: 349 MB (1.57 MB local)
Custom emoji: 49 MB (0 Bytes local)
Preview cards: 10.3 MB
Avatars: 2.15 GB (28.9 KB local)
Headers: 4.55 GB (0 Bytes local)
Backups: 0 Bytes
Imports: 0 Bytes
Settings: 0 Bytes

I've got Mastodon configured to purge media older than 3 days. So profile pictures, attachments, or preview cards for posts that haven't been seen in more than 3 days get deleted. If I scroll my feed back far enough, I'd start getting image load issues. I understand that they are reloaded on demand, but I haven't really tested that.
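
However it's scheduled, the cleanup itself boils down to tootctl; a cron job along these lines (the schedule here is just an illustration) trims cached remote media and preview cards to 3 days:

# run nightly as the mastodon user, from the live checkout
0 3 * * * cd /home/mastodon/live && RAILS_ENV=production bundle exec bin/tootctl media remove --days 3
30 3 * * * cd /home/mastodon/live && RAILS_ENV=production bundle exec bin/tootctl preview_cards remove --days 3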

The local files are those created by users who live on my instance, rather than users who live elsewhere and are federated in via following or boosting. So it's just me: 1.57 MB of my own uploads, and my 28.9 KB profile picture. The rest are from people I follow, or people my follows boosted.

In Mastodon 4.0.2, old header images are not purged the same way attachments, avatars, etc. are. That is being fixed in Mastodon 4.1.0, which I hope to install soon. See https://github.com/mastodon/mastodon/issues/9567

Backups

After about a month of running I decided to get a little more serious and actually back stuff up.

There’s a guide on the website: https://docs.joinmastodon.org/admin/backups/

For my needs, the main things to back up are just the configuration files and the Postgres database. I don't really care if media gets lost in a disaster, or even about losing my post history. However, since the per-account keys used to sign federated activity live in the Postgres database, losing it would break all my follows.

I copied the configuration file to 1Password.

I basically followed this guide to set up Postgres backups to S3: https://zaiste.net/posts/backup-postgresql-to-amazon-s3/
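
The core of it is just pg_dump piped through gzip and shipped to S3 with the AWS CLI; a minimal sketch (the bucket name and paths are placeholders), run nightly from cron, looks like this:

#!/bin/bash
# dump the Mastodon database, compress it, and push it to S3
set -euo pipefail

STAMP=$(date +%Y-%m-%d)
FILE="/tmp/mastodon-db-${STAMP}.sql.gz"

# mastodon_production is the default database name from the setup guide
sudo -u postgres pg_dump mastodon_production | gzip > "$FILE"

# requires the AWS CLI to be installed and configured with credentials
aws s3 cp "$FILE" "s3://my-backup-bucket/mastodon/"

rm "$FILE"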

That brings us to today. Next steps I want to try:

  1. Upgrading to the 4.1 release candidate to get access to purging old header images
  2. Moving media to S3 instead of local disk (a rough sketch of the relevant settings is below)
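
For that second item, Mastodon reads its media storage settings from environment variables in .env.production; the switch would look roughly like this (bucket, region, and credentials are placeholders), plus syncing the existing public/system files into the bucket and restarting the services:

# .env.production
S3_ENABLED=true
S3_BUCKET=my-mastodon-media
S3_REGION=us-east-1
AWS_ACCESS_KEY_ID=xxxxxxxxxx
AWS_SECRET_ACCESS_KEY=xxxxxxxxxx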