264 points davidgomes | 2 comments
Netcob No.41877849
My personal reason: While I haven't had to deal with a Postgres update at work yet, I've been running a pretty large Postgres 12 database in my homelab for a few years now.

My homelab projects mostly center around an "everything is an MQTT message" idea: Zigbee sensors, Tasmota power readings, OwnTracks locations, surveillance camera events, motion sensors for light switches, the currently active app on my PC, the status of my 3D printer, whatever my vacuum robots are up to, and so on. It all gets recorded into a Postgres db. From there I can use it for data mining experiments, but mostly as a source for Grafana. I tried counting the rows, but that query didn't even complete while I was writing this comment.
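(For what it's worth, the planner's statistics give an instant approximate count without scanning the table; a minimal sketch, assuming a hypothetical database homelab and table sensor_data:

    psql -d homelab -c "SELECT reltuples::bigint AS approx_rows FROM pg_class WHERE relname = 'sensor_data';"

reltuples is only as fresh as the last VACUUM or ANALYZE, but it returns immediately.)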

I like trying out all kinds of dockerized oss services, and I keep them updated using watchtower. I run a gitlab instance, which is usually the most annoying service to update because there's an upgrade path and post-start migrations. With my Postgres instance, which is isolated from the internet, I'll have to figure out the fastest way to move all that data around, not leave a huge gap in the record, and so on. Sounds like at least a day of work - and since it's technically all for "fun", it'll have to wait until it actually is fun.
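(Watchtower's standard setup, per its docs, is a single container watching the Docker socket and restarting containers when new images appear:

    docker run -d --name watchtower \
      -v /var/run/docker.sock:/var/run/docker.sock \
      containrrr/watchtower

It can't help with services like gitlab or Postgres that need manual migration steps between major versions, which is exactly the problem here.)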

replies(1): >>41877908 #
1. Symbiote No.41877908
A good approach for this is to use pg_upgrade in-place, which should give you a downtime of a few minutes at most. (I have 800GB at work and would expect 1-2 minutes for this.)
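A minimal sketch of the in-place run (paths and the target version are illustrative; both old and new binaries must be installed and the new cluster initialized first):

    # Dry run: --check validates the clusters without changing anything,
    # and can be run while the old server is still up.
    sudo -u postgres /usr/lib/postgresql/17/bin/pg_upgrade \
        --old-bindir=/usr/lib/postgresql/12/bin \
        --new-bindir=/usr/lib/postgresql/17/bin \
        --old-datadir=/var/lib/postgresql/12/main \
        --new-datadir=/var/lib/postgresql/17/main \
        --link --check
    # Stop the old server and rerun without --check for the real upgrade.
    # --link hard-links data files instead of copying them, which is what
    # keeps the downtime to minutes even for hundreds of GB.
    sudo -u postgres vacuumdb --all --analyze-in-stages  # optimizer stats aren't migrated

Note that --link shares files between the old and new clusters, so keep a backup around until you're happy with the result.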

I recommend installing PG12 on a temporary VM, duplicating the existing database, and testing the upgrade in isolation.
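For the duplication step, pg_basebackup takes a physical copy from the running instance, which is the form pg_upgrade needs to operate on; a sketch with hypothetical host and role names (requires a role with REPLICATION and a matching pg_hba.conf entry):

    # On the test VM: pull a physical copy of the live PG12 data directory
    pg_basebackup -h prod-host -U repl_user -D /var/lib/postgresql/12/main -P
    # Start PG12 against it, then rehearse the pg_upgrade steps and time them.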

https://www.postgresql.org/docs/current/pgupgrade.html

A more complicated approach uses replication, and upgrades the standby server before promoting it to the primary server.
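Since a physical standby can't replay WAL across major versions, one way to realize this is logical replication: a new-version server subscribes, syncs, and clients are repointed once it has caught up. A rough sketch, again with hypothetical names (the publisher needs wal_level = logical; sequences and DDL aren't replicated):

    # Old PG12 server: publish all tables
    psql -d homelab -c "CREATE PUBLICATION up_pub FOR ALL TABLES;"
    # New server: copy the schema, then subscribe (initial sync copies existing rows)
    pg_dump -s -d homelab | psql -h new-host -d homelab
    psql -h new-host -d homelab -c "CREATE SUBSCRIPTION up_sub CONNECTION 'host=old-host dbname=homelab user=repl_user' PUBLICATION up_pub;"
    # Once caught up: stop writers, repoint clients at new-host, drop the subscription.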

replies(1): >>41878475 #
2. Netcob No.41878475
Thank you! Looks like the best way to do this.

And since I have backups, I might not even need the testing step, considering the low risk. Might do it anyway, just out of curiosity about how long the duplication would take.
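(The copy step is easy to time on its own, with the same hypothetical names as above:

    time pg_basebackup -h prod-host -U repl_user -D /tmp/pg12-copy -P

)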