Self-Hosted Headaches with Caddy, Dynmap, and Docker!

As some of you may know, I love Minecraft!

I run a Forge-modded server for some friends and use a mod called Dynmap (which is also a popular plugin by the same name). This mod allows players to view a map of the entire Minecraft world in their browser and see the locations of other players.

This had been working out beautifully until our world started to grow rather large in file size. You see, Dynmap renders/generates map tiles based on the chunks in your world, and the larger your world is, the more map tiles there are. For some context, my world is about 9GB, and the Dynmap files that make up the map come in at around 43GB.

That’s a lot of space! And I knew that was going to be an issue as our world grew.

I manage my Minecraft server using AMP and it lives within an Ubuntu virtual machine dedicated to Minecraft, which lives on a node in a Proxmox cluster.

I could have just assigned more space to the VM and called it a day (which is what I did as a short-term solution), BUT I have a NAS with lots of space hosted on another node within the same Proxmox cluster, on the same local network, and I began to wonder if there was a way to offload the storage component of Dynmap from the Minecraft server to the NAS.

As it turns out, there is!

A user named haloflooder posted an in-depth tutorial on Dynmap’s GitHub going over how to set up a MySQL database and standalone web server.

A couple of notes on using MySQL as your storage medium for Dynmap:

  1. You HAVE to do a fresh full render of your world after setting up MySQL (or any other database backend). You can’t migrate any existing renders.
  2. As pointed out by user database64128 in a Dynmap GitHub issues thread, using MySQL for map storage may not be the most resource-efficient way of storing map data:

I would recommend against using MySQL for map storage, unless you have dedicated hardware acting as the MySQL server. I have 2 Forge servers running on my home server. The one configured with MySQL has significantly more CPU, RAM and disk I/O overhead due to transactions with the database. In fact, MySQL was not designed to store BLOBs, which make up almost all of dynmap’s map storage. By using MySQL as map storage, extra CPU cycles are consumed by SQL transactions. And because dynmap stores map data as image BLOBs, using MySQL actually results in way worse performance than file storage, as observed on my home server. See also https://stackoverflow.com/questions/5285857/when-is-using-mysql-blob-recommended

database64128

That said, I’ll still be going the MySQL route, as it seems to be the easiest way to link a remote database up to Dynmap (I couldn’t figure out how to do this with the default storage method), and this is all an experiment for me, so I’m not worried about performance as much as actually getting this to work (lol).

Your mileage may vary. Just wanted to point that out.

And, because I’m a glutton for punishment (or Docker evangelist, take your pick), I decided to try to get everything working in Docker.   

As you might know from my previous posts, I’m a bit of a Docker fan.

I couldn’t find any useful information or examples pertaining to existing Dynmap “stacks” (even haloflooder’s tutorial is about installing everything directly on a server).  

So I took a shot at throwing a Docker Compose file together!

---
version: "3.8"
services:
  mysql:
    container_name: dynmap-mysql
    image: mysql:8
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: PASSWORD #<---change this
      MYSQL_DATABASE: dynmap 
      MYSQL_USER: dynmap
      MYSQL_PASSWORD: PASSWORD #<---change this
    ports:
      - 3306:3306
    volumes:
      - db:/var/lib/mysql
      - initdb:/docker-entrypoint-initdb.d
      - db_log:/var/log/mysql

volumes:
  db:
    driver_opts:
      type: cifs
      o: 'username=USERNAME,password=PASSWORD' #<---change this
      device: //PATH_TO_DB #<---change this
  db_log:
    driver_opts:
      type: cifs
      o: 'username=USERNAME,password=PASSWORD' #<---change this
      device: //PATH_TO_DB #<---change this
  initdb:
    driver_opts:
      type: cifs
      o: 'username=USERNAME,password=PASSWORD' #<---change this
      device: //PATH_TO_DB #<---change this

One thing to note is that Forge does not come with a MySQL connector. There is, however, the MySQL JDBC mod (for both Forge and Fabric) that will allow you to hook into your database. That’s what I’ll be using.

With the MySQL database online, I started modifying the configuration.txt file that resides within the Dynmap root folder (server_folder > dynmap), commenting out the default storage type (which is filetree), and uncommenting/filling out the MySQL info:

storage:
  # Filetree storage (standard tree of image files for maps)
  #type: filetree
  # SQLite db for map storage (uses dbfile as storage location)
  #type: sqlite
  #dbfile: dynmap.db
  # MySQL DB for map storage (at 'hostname':'port' with flags "flags" in database 'database' using user 'userid' password 'password' and table prefix 'prefix')
  type: mysql
  hostname: CHANGE_THIS
  port: CHANGE_THIS
  database: CHANGE_THIS
  userid: CHANGE_THIS
  password: CHANGE_THIS
  prefix: ""
  flags: "?allowReconnect=true&autoReconnect=true"

Going back to the tutorial, I noticed that haloflooder used Nginx as the web server (and to reverse-proxy the map). I don’t have much experience with Nginx, but I already have Caddy set up doing all my reverse-proxying, so I thought this would be the perfect time to try out its web server functionality!

I was able to get the Caddy web server spun up without too much fuss. While Caddy has some wonderful documentation for the most part, the material on Docker is lacking, and it took some fumbling to get everything right.

Here’s what I ended up with in my Compose file:

---
version: "3.8"
services:
  caddy:
    container_name: caddy
    image: caddy:2.7
    restart: unless-stopped
    ports:
      - 7001:80
      - 7002:443
      - 7003:2015
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - /PATH_TO_CONFIG/Caddyfile:/etc/caddy/Caddyfile
      - /PATH_TO_CONFIG/data:/data
      - /PATH_TO_CONFIG/config:/config
      - /PATH_TO_CONFIG/site:/srv

I was able to reach the default Caddy page on port 7003, so I moved on to the next step.

The tutorial has you change some additional settings within the configuration.txt file, then bring the server up and shut it down to load your config changes. Next, you copy the contents of your dynmap/web directory to whatever directory you’re using to host the external web server; in my case, that was the /PATH_TO_CONFIG/site directory.
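
For context, the static-file part of the Caddyfile boils down to something like this minimal sketch (my sketch, not haloflooder’s exact config): the /srv root matches the /PATH_TO_CONFIG/site:/srv mount, and :2015 is the container port published as 7003 in the Compose snippet above.

:2015 {
	root * /srv
	file_server
}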

Now, this is where everything started to fall apart…

When trying to hit the web server on port 7003, I ran into a black screen.

Some things were loading, since the page title and favicon loaded correctly. Just no content. 🤔

Despite following fixes that other users had posted after encountering a similar issue, I wasn’t able to get it working. I thought maybe it was because I was missing the PHP part of the mix (which I was under the impression was only necessary for the web chat component), but after adding PHP via Compose and the Caddyfile, I still couldn’t shake the black screen.
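
For the curious, the Caddyfile half of that PHP attempt amounted to roughly the following; the php:9000 upstream assumes a PHP-FPM container named php on the same Docker network, which is my own naming here rather than anything from the tutorial.

:2015 {
	root * /srv
	# hand PHP requests to a PHP-FPM container (hypothetical service name "php");
	# note the FPM container also needs the site files mounted at the same path
	php_fastcgi php:9000
	file_server
}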

At this point, I was ready to throw in the towel.

As nice as it would have been to get the external web server part of this working (which would allow Dynmap’s map to be accessible even when the server was asleep/offline), it was becoming increasingly frustrating and I decided that I had already achieved my main objective of shifting the storage burden off of the Minecraft server.

So I scrapped the external web server idea and reverted some of the changes I made to the configuration.txt file. Then I verified that Dynmap’s internal web server was working again and double-checked all of the info for the MySQL database.

As I said at the beginning of this post, my world is on the larger side, and a fresh render is required after switching to MySQL for storage. So from here I ran a full render from the Minecraft server’s console and let things bake overnight…
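
If you’ve never kicked one off before, a full render can be started from the server console with something like the following (swap world for your actual world name):

dynmap fullrender world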

Ta-da!

Once the render was done and I verified that everything looked good, I set up the Caddyfile for reverse proxying, which is dead easy to do:

{
	email youremailhere@email.com
}

map.domain.com {
	reverse_proxy IP.ADDRESS:PORT
}
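
If Caddy is already running, a Caddyfile change like this can be picked up without bouncing the container; something along these lines should do it (assuming the service is named caddy, as in the Compose file below):

docker compose exec caddy caddy reload --config /etc/caddy/Caddyfile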

Annnnd when all was said and done, this is basically what my Docker Compose file looked like:

---
version: "3.8"
services:

  caddy:
    container_name: caddy
    image: caddy:latest
    restart: unless-stopped
    ports:
      - 80:80
      - 443:443
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - /PATH_TO_DIRECTORY/Caddyfile:/etc/caddy/Caddyfile #<---change this
      - /PATH_TO_DIRECTORY/data:/data #<---change this
      - /PATH_TO_DIRECTORY/config:/config #<---change this
   
  mysql:
    container_name: dynmap-mysql
    image: mysql:8
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: PASSWORD #<---change this
      MYSQL_DATABASE: dynmap
      MYSQL_USER: dynmap
      MYSQL_PASSWORD: PASSWORD #<---change this
    ports:
      - 3306:3306
    volumes:
      - db:/var/lib/mysql
      - initdb:/docker-entrypoint-initdb.d
      - db_log:/var/log/mysql

volumes:
  db:
    driver_opts:
      type: cifs
      o: 'username=USERNAME,password=PASSWORD' #<---change this
      device: //PATH_TO_DB #<---change this
  db_log:
    driver_opts:
      type: cifs
      o: 'username=USERNAME,password=PASSWORD' #<---change this
      device: //PATH_TO_DB #<---change this
  initdb:
    driver_opts:
      type: cifs
      o: 'username=USERNAME,password=PASSWORD' #<---change this
      device: //PATH_TO_DB #<---change this

As always, I’d love to hear your thoughts! Are you using Dynmap? If so, how do you have it deployed? Always interested to see how people have their stacks set up!
