Hi folks,
I'm writing a backup script which creates incremental tar archives, encrypts them with gpg and uploads the encrypted archives to rapidshare. This is meant to serve as a disaster recovery backup.
The script works fine as long as the archive is small. As soon as the backup volume grows (e.g. 700 MB of JPEGs), the tar process takes nearly forever and the box goes down. Checking memory consumption in a second shell, I can see that nearly all free memory is eaten up by tar.
Is this a known problem? If so, is there a solution?
The tar command I used:
tar -cz -f $TGZFILE -g $SNARFILE $DIRECTORIES
It makes no difference if I drop the compression switch 'z'.
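For context, the relevant part of the script boils down to roughly the following (the paths and directory list here are just example values standing in for my real setup; passphrase handling and the rapidshare upload step are left out):

#!/bin/sh
# Example values, not my real paths:
DIRECTORIES="/home /etc"
TGZFILE="/backup/backup-$(date +%Y%m%d).tgz"
SNARFILE="/backup/backup.snar"    # GNU tar listed-incremental snapshot file

# Create the (incremental) archive.
tar -cz -f "$TGZFILE" -g "$SNARFILE" $DIRECTORIES

# Encrypt it symmetrically; passphrase handling omitted here.
gpg --batch --symmetric --output "$TGZFILE.gpg" "$TGZFILE"

# Upload of "$TGZFILE.gpg" to rapidshare follows (not shown).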
Callidus