It’s been a while since I shared why backups and I do not mesh, and I even went as far as asking cloud services to encrypt my data for me. Today (well, it’s actually 1 am, so this morning) I am going to share a slightly improved version of my backup methods.
This time, I have built some complexity into my backup regime and made it more robust. To summarise: in the past I used applications such as Cobian and EaseUS; Cobian’s VSS service kept failing me, and EaseUS was corrupting the data as it wrote, so I had 2 copies of corrupt data and was unaware for months. With my newest solution I aim to tackle this with the following:
- 4 jobs to individually copy files (explained later);
- 1 job to copy all 4 backup files to a NAS, and;
- 1 job to copy the local files to another NAS drive.
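The six jobs above could be sketched as a simple list; note this is purely a hypothetical layout for illustration (the job names, paths and fields are mine, not how iPerius actually stores its jobs):

```python
# Hypothetical sketch of the six-job layout described above.
# Every name and path here is made up; iPerius keeps its jobs
# in its own internal format.
JOBS = [
    {"name": f"pgd-backup-{i}", "kind": "file-copy",
     "source": f"D:/pgp/disk{i}.pgd", "dest": "E:/backups/"}
    for i in range(1, 5)
] + [
    {"name": "nas-mirror", "kind": "mirror",
     "source": "E:/backups/", "dest": "//NAS1/backups/"},
    {"name": "nas-local-copy", "kind": "mirror",
     "source": "D:/pgp/", "dest": "//NAS2/pgp-raw/"},
]

# Four individual file-copy jobs, plus the two mirror jobs.
file_jobs = [j for j in JOBS if j["kind"] == "file-copy"]
```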
With these jobs, the end files are written as native .zip archives, and an email notification with the job status is sent out to confirm the outcome. Of course, this isn’t foolproof; but I am hoping that the three duplicate copies of the files and the email notifications are enough to save me; heck, the program also checks the integrity of the files! So, I am going to detail this for you wonderful people.
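As a minimal sketch of the kind of integrity check I mean: because the backups are plain .zip files, every member carries a CRC that can be verified after the write. iPerius does its own verification; this just shows the idea using Python’s standard `zipfile` module.

```python
import os
import tempfile
import zipfile

def archive_is_intact(path: str) -> bool:
    """Verify the CRC of every member in a .zip archive."""
    with zipfile.ZipFile(path) as zf:
        return zf.testzip() is None  # None => every CRC checked out

# Build a tiny archive so the check has something to verify.
tmp = tempfile.mkdtemp()
archive = os.path.join(tmp, "backup.zip")
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("secrets.pgd", b"not really a PGP disk")

print(archive_is_intact(archive))
```

This is exactly the sort of check that would have caught my EaseUS corruption months earlier.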
Just warning you – this is going to be a lot of “this is the button I clicked, this is the screen” below, but that’s life.
Having an escape medium
Before beginning, I needed a solution that would “get me out of trouble should this PC die”. The obvious solution for holding upwards of 40GB is a NAS, or Network Attached Storage. Using independent media lets me keep damage to my PC (viruses, power surges and water damage being the main causes) away from the NAS.
The initial copy job should be local (as in, an internal drive) for the following reasons:
- You’re less likely to end up with corrupt backups locally than over the network;
- If the external media dies, you should have another copy of the data, and;
- You’ve got immediate access for recovery should you require it.
So with that, I re-purposed my Orico 2 Bay NAS (which has been awesome if you’re after a cheap NAS bay, though it does not have full “NAS” functionality) to be the medium for receiving the data. Because I encrypt my files, I needed a copy of AxCrypt and Symantec PGP Encryption in each directory I planned on restoring the backups from. This was my “escape medium”.
Under 4 directories (2x NAS drives, local SSD and recovery USB) I housed the installers for AxCrypt and Symantec PGP. Note that because my backup solution writes native .zip files, I do not need any extra tool to extract them.
In this example (and after over 25 hours of research) I have come to the conclusion that iPerius is one of the best data backup tools out there. I enjoyed this software so much that I paid the $315.00 USD to unlock all features (and “support development”).
I will go through, step by step and with pictures, how I set up my backup policies.
The first step in creating your backup files is to set your “working” directories; the folders you wish to back up. Because I use Symantec PGP disks, I only want to back up the .pgd files, and not the entire directory.
With iPerius, I would apply the filter to only include the PGD file extension, as follows:
However, I also wanted one job per PGD file, so I added the other 3 PGD files to the exclusion list, as follows:
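The include/exclude logic above is roughly equivalent to the following sketch. The file names are hypothetical, and iPerius does all of this through its GUI; this just re-creates the filtering behaviour with standard-library `fnmatch`.

```python
import fnmatch

def select_files(files, include="*.pgd", exclude=()):
    """Keep files matching the include pattern, then drop exclusions."""
    kept = [f for f in files if fnmatch.fnmatch(f, include)]
    return [f for f in kept if f not in exclude]

# Hypothetical disk names: one job backs up work.pgd and excludes
# the other three PGP disks.
candidates = ["work.pgd", "home.pgd", "media.pgd", "mail.pgd", "notes.txt"]
job_files = select_files(candidates,
                         exclude=("home.pgd", "media.pgd", "mail.pgd"))
```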
Why is this important?
The reason this is so important is that it demonstrates iPerius’s ability not only to include specific file extensions, but also to exclude the specific files you do not want.
The next step is to state where you want the data stored. In this screen there are 3 important options I wanted to identify:
- How many copies of the same file I wanted this job to keep;
- Whether I wanted the files compressed, and protected with a password, and;
- What I wanted the file to be named.
As you can see, I want to keep ~10 copies (full, not incremental) of the backup, and I have named the file based on the following parameters:
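The naming-plus-retention behaviour can be sketched like this. The naming scheme below is a plain timestamp of my own invention (iPerius builds names from its own tags), and the retention count matches the ~10 full copies configured above:

```python
import os
import tempfile
from datetime import datetime

KEEP = 10  # number of full copies to retain, as configured above

def backup_name(job: str, when: datetime) -> str:
    # Hypothetical naming scheme; iPerius uses its own name tags.
    return f"{job}-{when:%Y-%m-%d-%H%M}.zip"

def prune(folder: str, keep: int = KEEP) -> list:
    """Delete the oldest archives so only `keep` remain."""
    zips = sorted(f for f in os.listdir(folder) if f.endswith(".zip"))
    for old in zips[:-keep]:
        os.remove(os.path.join(folder, old))
    return sorted(f for f in os.listdir(folder) if f.endswith(".zip"))

# Simulate 12 nightly runs, then prune back down to 10 full copies.
folder = tempfile.mkdtemp()
for day in range(1, 13):
    name = backup_name("pgd-work", datetime(2016, 5, day))
    open(os.path.join(folder, name), "w").close()
remaining = prune(folder)
```

Zero-padded timestamps in the name mean a plain lexicographic sort is also a chronological sort, which is what makes the pruning trivial.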
Logging The Data
Next, we want to make it autonomous; if you have to initiate the backup manually, you’ll probably forget.
I went and set my days, and time, and asked the program to create a log file per backup:
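The “one log file per backup” idea looks something like this sketch; iPerius writes its own logs, and the directory and message below are made up, but the shape is the same: a fresh, timestamped file for every run.

```python
import logging
import os
import tempfile
from datetime import datetime

def run_logger(log_dir: str, when: datetime):
    """Create a logger that writes to a per-run, timestamped log file."""
    path = os.path.join(log_dir, f"backup-{when:%Y%m%d-%H%M}.log")
    logger = logging.getLogger(path)  # path makes the logger unique per run
    handler = logging.FileHandler(path)
    handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger, path

# One run, one log file (hypothetical job name and message).
log_dir = tempfile.mkdtemp()
logger, path = run_logger(log_dir, datetime(2016, 5, 1, 2, 0))
logger.info("job 'pgd-work' finished OK")
```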
Then, I configured the program to send me an email upon completion of each job:
I want to pay special attention to the email it sends you. Not only can you customise the header, recipients and body, but the format is easy to understand, and it is all wrapped up in a nicely presented status bar that tells you where the job is at:
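As a rough stand-in for that notification: iPerius composes and sends its own emails, but a status message of this shape is easy to build with Python’s `email` module. The addresses, subject format and body here are all invented for illustration.

```python
from email.message import EmailMessage

def status_email(job: str, ok: bool, details: str) -> EmailMessage:
    """Build a backup-status email (hypothetical format and addresses)."""
    msg = EmailMessage()
    msg["From"] = "backups@example.com"
    msg["To"] = "me@example.com"
    msg["Subject"] = f"[{'OK' if ok else 'FAILED'}] backup job: {job}"
    msg.set_content(details)
    # To actually send: smtplib.SMTP(host).send_message(msg)
    return msg

msg = status_email("pgd-work", True, "4 files, 9.8 GB, verified.")
```

A subject line that leads with OK/FAILED means the Gmail inbox view alone tells you whether last night’s run needs attention.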
The encryption portion of the data
In the past I’ve explained why it’s important to protect personal privacy, and demonstrated how basic encryption works, but never how I personally achieve it. A few years ago I purchased 15 copies of a PGP program for a customer who never went through with the deal – so now I have spare copies!
The disk files show up in Explorer like so:
To access the files (mounting them into memory), you simply enter the disk’s passphrase in the following pop-up:
Why do I use this tool? It’s reliable, and rather secure – if my understanding of how PGP works is accurate, anyway.
So there you have it, a simple backup policy, that has enough redundancy to keep me out of trouble! Now if only Gmail labels were useful… 😉