There are certainly many ways of doing this, but here is how I managed to schedule duplicacy on a Linux server (Ubuntu 18.04). (If you know a better/different way, please add it! Or if I made a mistake (I'm not a Linux expert!), please correct it.)

To start with: I'm running duplicacy as root so that I can back up files from different users in the same backup. If you are just backing up files from a single user, you can probably save yourself a lot of trouble by not using sudo.

1. Initialize your repository (mine is called NAS)

```
cd /srv/NAS/
sudo duplicacy -d init -e NAS <storage url>
```

Make sure you adjust the working directory and the init command to suit your needs. You will probably get some errors like

```
Reading the environment variable DUPLICACY_WEBDAV_PASSWORD
Failed to store the value to the keyring: keyring/dbus: Error connecting to dbus session, not registering SecretService provider: dbus: DBUS_SESSION_BUS_ADDRESS not set
```

Just ignore them and enter the requested passwords (which will trigger another error, but we'll tackle that in the next step).

2. Save your passwords into your preferences file

```
sudo duplicacy set -key webdav_password -value "this is my webdav passphrase"
sudo duplicacy set -key password -value "this is my storage passphrase"
```

(This is probably not the most secure method, but I couldn't figure out how to get environment variables to work with sudo… I guess security is okay, since only root has access to the preferences file.)

3. Run your first backup

Before running the backup automatically, I like to test it manually to see that things are working fine, doing something like

```
sudo duplicacy -log backup -stats
```

You should not be asked for any password. You can either let your initial backup run through this way (in that case, perhaps use `sudo duplicacy -log backup -stats &` to let it run in the background) or you can stop it and let it be triggered according to schedule (see below).

4. There shall be no step 4 in this tutorial

5. Schedule the backup

I want to keep logs of all automated backups, and since I did not manage to achieve this on a single line in crontab, I use a script (which also adds some flexibility):

```
#!/bin/sh
cd /srv/NAS/
/usr/local/bin/duplicacy -log backup -stats
echo "`date` Stopped backing up $PWD."
```

Save the script wherever it suits you and don't forget to make it executable. Then schedule the backup with

```
sudo crontab -e
```

and add something like

```
30 3 * * * /home/christoph/backup_NAS.sh > /srv/NAS/christoph/duplicacy/logs/NAS_backup_`date "+\%Y-\%m-\%d-\%H-\%M"`.log 2>&1
```

---

This is code I use (written many years ago) for setting a lock in a script. No script will be foolproof, but this should reduce instances of stale locks blocking the backup:

```
#!/bin/sh
LOCKFILE=/tmp/backup.lock

if [ -f "$LOCKFILE" ]; then
    read pid other < "$LOCKFILE"
    kill -0 "$pid" 2>/dev/null && exit 1   # Locked
    rm -f "$LOCKFILE"                      # Lock file exists but stale
fi
# No lock exists
echo "$$" > "$LOCKFILE"
# Make sure our PID got in first (if two processes running in sync)
read pid other < "$LOCKFILE"
[ "$pid" = "$$" ] || exit 1
```

For example, put that in setlock.sh and try

```
$ bash /tmp/setlock.sh & bash /tmp/setlock.sh
# Lock file exists but stale in both cases (one process run after other)
# Lock file exists but stale, except for second process, which can't obtain lock
$ bash /tmp/setlock.sh | bash /tmp/setlock.sh
# which process wins may vary, but only one will win
```

A good way I have found to do locking is to use the handy flock command, which will attempt to take a lock on a file or directory and then execute another command. Once the process finishes, the lock is released. The lock can be taken on any file or directory with zero impact on most other operations… unless your application tries to take its own file lock along the way.

So reimplementing the backup script in the OP by using flock could be done as follows:

```
#!/bin/sh
# Attempt to change to repository and bail out if it's not possible
cd /srv/NAS/ || exit 1
flock -en /srv/NAS /usr/local/bin/duplicacy -log backup -stats
echo "$(date) Stopped backing up $PWD."
```

The above means that you can use similar locking functionality for your prune process, to ensure your backup and prune jobs are not overlapping. The other difference is ensuring the script fails if the `cd /srv/NAS/` command fails, which in this case is not a huge deal as the duplicacy job will fail… or maybe not, if this script is executed from another repository location.
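The flock behaviour described above can be tried without duplicacy at all. A minimal sketch, with made-up lock path and messages and `sleep`/`echo` standing in for a running backup:

```shell
#!/bin/sh
LOCK=/tmp/flock-demo.lock

# Job A takes the lock and holds it for two seconds.
flock -en "$LOCK" sh -c 'echo "job A holds the lock"; sleep 2' &
sleep 1   # give job A time to acquire the lock

# Job B uses -n (non-blocking): it gives up immediately instead of waiting.
flock -en "$LOCK" echo "job B holds the lock" \
    || echo "job B skipped: lock already held"
wait
# prints: job A holds the lock
#         job B skipped: lock already held
```

Because `-n` makes the second `flock` exit with a non-zero status rather than wait, a cron job using this pattern simply skips a run that would have overlapped the previous one.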
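If skipping a run is not what you want, dropping the `-n` flag makes `flock` block until the lock is free, so a prune job scheduled near the backup simply queues behind it. A hypothetical sketch (lock path is made up; `sleep`/`echo` stand in for the duplicacy backup and prune commands):

```shell
#!/bin/sh
# Without -n, the second flock WAITS for the lock instead of failing,
# serializing the two jobs on the same lock file.
LOCK=/tmp/nas-repo.lock

flock -e "$LOCK" sh -c 'sleep 2; echo "backup finished"' &
sleep 1   # let the backup job take the lock first
flock -e "$LOCK" echo "prune ran after the backup released the lock"
wait
```

The two lines always print in that order, no matter how the cron times drift, because the prune command cannot start until the backup's lock is released.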
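One detail of the crontab line worth knowing: the percent signs in the `date` format are written `\%` because cron treats a bare `%` in the command field as a newline. In an interactive shell no escaping is needed, so you can preview the log filename the cron entry generates like this (filename pattern from the post):

```shell
#!/bin/sh
# In crontab: date "+\%Y-\%m-\%d-\%H-\%M"  (escaped % signs)
# In a shell the same format string is used unescaped:
logname="NAS_backup_$(date "+%Y-%m-%d-%H-%M").log"
echo "$logname"
```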