Hi fellow self-hoster.
Almost a year ago I experimented with Immich and found, at the time, that it was not up to par with what I was expecting from it. Basically, my use case was slightly different from the Immich user experience.
After all this time I decided to give it another go, and I am amazed! It has grown a lot; it now has all the features I need that were lacking back then.
So, in just a few hours I set it up and configured my external libraries, backups, storage template, and OIDC authentication with Authelia. Everything works.
Great kudos to the devs, who are doing amazing work.
I have documented all the steps of the process at the link at the top of this post; I hope it can be useful for someone.
How did you do external backups?
I used to use a Docker container that made db dumps of the database and dropped them into the same persistent storage folder the main application uses. I did this for everything in Docker that had a db.
Immich has recently integrated this into the app itself, so it's no longer needed.
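If anyone's on an older version and still needs the manual route, it boils down to a one-liner you can cron. The container name, db user, and dump path here are assumptions based on the default compose stack, so adjust them to yours:

#!/bin/bash
# Dump the whole Postgres cluster from the Immich db container into the
# persistent storage folder. Container, user, and path are assumed defaults.
docker exec -t immich_postgres pg_dumpall --clean --if-exists --username=postgres \
  | gzip > /dockerdata/immich/backups/immich-db-$(date +%F).sql.gz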
All my Docker persistent data is in a top-level folder called dockerdata.
In that I have subfolders like immich, which get mounted as volumes in the Docker apps.
So now I have only 1 folder to back up for everything. I use ZFS snapshots to back up locally (zfs-auto-snapshot) and borgmatic for remote backups (BorgBase); there's a rough sketch of the borg side below.
All my Docker stacks are compose files that are in git.
I can restore the entire server by restoring 1 data folder and 1 compose file per stack.
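In case it helps anyone picture the remote side: borgmatic is just driving borg underneath, and what it runs amounts to roughly this. The repo URL, passphrase handling, and retention policy are placeholders, not my real config:

#!/bin/bash
# Rough sketch of what borgmatic does under the hood; the repo URL,
# passphrase, and retention numbers are all placeholders.
export BORG_PASSPHRASE='change-me'
REPO='ssh://xxxx@xxxx.repo.borgbase.com/./repo'
# Create a dated archive of the single top-level data folder...
borg create --stats --compression zstd "$REPO::dockerdata-{now:%Y-%m-%d}" /dockerdata
# ...then thin out old archives.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 "$REPO"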
I don’t understand how that’s helpful. If something is corrupted or my house burns down, a local backup is going to go with it. That’s why I asked for external backups.
BorgBase is remote.
I back up with restic: the database dumps done by Immich (not the live database itself), plus the Library/library folder, which contains the actual images and videos.
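If a concrete shape helps, it's roughly the following. The repository, credentials, and paths are placeholders, not my actual setup:

#!/bin/bash
# Sketch only: repository, credentials, and paths are placeholders.
export AWS_ACCESS_KEY_ID='...'
export AWS_SECRET_ACCESS_KEY='...'
export RESTIC_REPOSITORY='s3:s3.amazonaws.com/my-immich-backup'
export RESTIC_PASSWORD='change-me'
# Back up Immich's own db dumps plus the originals, not the live database.
restic backup /path/to/immich/backups /path/to/immich/library/library
# Keep a rolling window of snapshots.
restic forget --keep-daily 7 --keep-weekly 4 --prune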
Did you read the whole post? If so, did you go to his/her website?
Yeah I just didn’t understand it.
If anyone’s interested, here’s my Immich backup script. You set up rclone to use an S3 storage service like Backblaze, which is quite cheap. I also use a crypt remote, which means rclone will encrypt and decrypt all files to/from the server. See the rclone docs for S3 configuration and crypt setup.
Then set this up as a cron job. With the “BACKUP_DIR” option, when you delete a photo it will get moved to the “deleted” folder. You can go into your S3 provider’s lifecycle settings and have these deleted after a number of days; I do 10 days. Or you can skip that and they’ll be gone forever.
#!/bin/bash
# Where the Immich originals live on disk
SRC_PATH="/path/to/immich/library"
# Encrypted rclone remote to mirror into
DEST_REMOTE="b2crypt:immich-photos/backup"
# Deleted/overwritten files get moved here instead of being removed
BACKUP_DIR="b2crypt:immich-photos/deleted"
RCLONE_OPTIONS="--copy-links --update --delete-during --backup-dir=$BACKUP_DIR --suffix `TZ='America/New_York' date +%Y-%m-%d`.bak --verbose"
rclone sync $SRC_PATH $DEST_REMOTE $RCLONE_OPTIONS
Yeah, I don’t know what any of these words mean. I just want to click “export” and back all the data up to a flash drive. Is that too much to ask?
One rclone command isn’t much more complicated than one button.
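For concreteness, the no-frills version, assuming you've named your remote "myremote" during setup, is:

# One-time interactive setup of a remote (pick your storage provider):
rclone config
# After that, a single command mirrors the library to it:
rclone sync /path/to/immich/library myremote:immich-backup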
Reading the comment I replied to, it appears to be much much more complicated. And I don’t understand how anyone can claim otherwise.
Key word is “appears”. Choose your source and destination, run rclone. That’s it. No harder than going to the page, clicking export, picking a folder, save. It’s really not hard at all, give it a try.
This tells me absolutely nothing about how to do that. Source for what? Destination for what? Choose them where? What is rclone? Where do I get it? How do I run it? What does it do?
All questions that don’t need to be answered before clicking a button in the UI.
Well, yeah, you could go on the site, select whatever photos, and hit download, I suppose.
There’s no way to do that for your entire library. Also I assume that would not retain the Immich-specific metadata like the ML object tags and the “people” tagged in the photos.
You should have a backup solution for your server that covers this; without one, you should probably stick with managed photo backup services.
That’s… why I’m asking?
…is that not what Immich is?