My recent coding project is a bash script that uses s3cmd to sync the CycleChat server with an Amazon S3 bucket. It will be fired off by cron in the early hours of every Sunday morning (the quietest time on the server).
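For the curious, the cron entry looks something like this (the script path, log file and exact time here are placeholders, not the real setup):

    # Run the S3 backup script at 04:00 every Sunday (0 = Sunday in cron)
    0 4 * * 0 /root/scripts/s3-backup.sh >> /var/log/s3-backup.log 2>&1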
It's taken some doing: certain files in /proc and /var/lib/elasticsearch crash and halt the sync, and the --exclude parameter doesn't work for some strange reason. After some experimenting I managed to work around it by running separate syncs for only the directories I do want to back up within /var.
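A rough sketch of the approach, for anyone doing something similar (the bucket name, the list of /var subdirectories and the --delete-removed option are illustrative; the real script differs):

    #!/bin/bash
    # Weekly backup sketch: instead of relying on --exclude, sync only the
    # directories we actually want. Bucket name and paths are placeholders.

    BUCKET="s3://example-cyclechat-backup"

    # Home directory - the bulk of the data
    s3cmd sync --delete-removed /home/ "$BUCKET/home/"

    # Only the parts of /var we want, which sidesteps /var/lib/elasticsearch
    for DIR in /var/www /var/log /var/spool/cron; do
        s3cmd sync --delete-removed "$DIR/" "$BUCKET$DIR/"
    done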
The longest part has been seeding the initial backup - 36 hours and still going. CycleChat's /home directory comes in at 27GB with nearly half a million files, and that's without the SQL database (another 6.7GB) or the rest of the server contents.