Question:
We have a backup process set to copy files over to a different network. Occasionally we lose the internet connection and one of the backup files fails to copy over. This becomes a bigger issue for larger files, as they take much longer to copy, and if one needs to be copied again we have to wait a few hours. Our proposed solution is to split the backups across multiple files so that when a copy fails we only lose a smaller file.
Are there any other/better solutions to handle this type of issue?
Answer:
If you back up to multiple files, the backup will normally run faster as well, so I think this is a win-win suggestion. If you do the file copy with something like robocopy, there is retry built in.
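As a minimal sketch (the local path, share name, and retry/wait values below are placeholder assumptions, not from the question), a robocopy call with restartable mode and an explicit retry policy would look something like this:

    :: Copy .bak files from the local backup folder to the remote share.
    :: /Z     - restartable mode: a partially copied file resumes instead of starting over
    :: /R:10  - retry each failed file up to 10 times
    :: /W:30  - wait 30 seconds between retries
    robocopy "D:\Backups" "\\remote-server\Backups" *.bak /Z /R:10 /W:30

The /Z switch is particularly relevant here: when the connection drops partway through a large file, the copy can resume from where it stopped rather than restarting, which addresses the "wait a few hours to re-copy" problem directly.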
Ultimately you need a stable network/internet connection, because any alternative solution you come up with (splitting the backup into smaller files, using the retry options of the copy tool) still depends on the network/internet.
However, splitting the backup files and compressing them locally (perhaps using 7-Zip) before copying can reduce the copy time, which means less chance of failure, though it is not a complete solution.
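As a rough sketch, assuming 7-Zip's command-line tool is installed at the default path and using placeholder file names and volume size, you could compress the backup and split it into smaller volumes before copying:

    :: Compress the backup file into a .7z archive split into 500 MB volumes (-v500m),
    :: so a dropped connection only forces a resend of one small volume.
    :: Paths and the 500 MB volume size are assumptions; adjust to your environment.
    "C:\Program Files\7-Zip\7z.exe" a -v500m "D:\Backups\full_backup.7z" "D:\Backups\full_backup.bak"

The resulting volumes are named full_backup.7z.001, full_backup.7z.002, and so on; copy them individually and extract on the destination starting from the .001 file (7z x full_backup.7z.001).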