My New Backup Policy

It’s been a while since I shared why backups and I don’t mesh, and I even went as far as asking cloud services to encrypt my data for me. Today (well, it’s actually 1am, so this morning) I am going to share with you a slightly improved version of my backup methods.

This time, I have built some redundancy into my backup regime and made it more robust. To summarize: in the past I used applications such as Cobian and EaseUS; Cobian’s VSS service kept failing me, and EaseUS was corrupting the data when writing, so I had 2 copies of corrupt data and was unaware for months. With my newest solution I aim to tackle this with the following:

  1. 4 jobs to individually copy files (explained later);
  2. 1 job to copy all 4 backup files to a NAS and;
  3. 1 job to copy the local files to another NAS drive.
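The job layout above can be sketched in Python for anyone curious how the pieces fit together; note this is a rough sketch, not what iPerius does internally, and the paths and file names are hypothetical stand-ins for my real ones.

```python
import shutil
import zipfile
from pathlib import Path

# Hypothetical paths standing in for my real layout.
PGD_FILES = [Path(f"C:/Data/disk{i}.pgd") for i in range(1, 5)]
LOCAL_BACKUP = Path("D:/Backups")                      # local SSD copy
NAS_TARGETS = [Path("//NAS1/backup"), Path("//NAS2/backup")]

def backup_one(pgd: Path, dest_dir: Path) -> Path:
    """Jobs 1-4: zip a single .pgd file into the local backup folder."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    archive = dest_dir / f"{pgd.stem}.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(pgd, arcname=pgd.name)
    return archive

def replicate(archives, targets):
    """Jobs 5-6: copy the local archives to each NAS share."""
    for target in targets:
        target.mkdir(parents=True, exist_ok=True)
        for archive in archives:
            shutil.copy2(archive, target)
```

Running `backup_one` for each disk file and then `replicate` over both NAS paths gives the 1 local + 2 remote copies described above.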

With these jobs, the end files are written as native .zip archives, and an email notification with the job status is sent out to confirm the outcome. Of course, this isn’t foolproof; but I am hoping that the 3 duplicated copies of the files and the email notifications are enough to save me; heck, the program also checks the integrity of the files! So, I am going to detail this for you wonderful people.
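The integrity check is the part that would have caught my EaseUS disaster. For .zip output you can approximate it yourself with Python’s standard library; `testzip()` reads every member and reports the first one that fails its CRC check.

```python
import zipfile

def archive_is_intact(path: str) -> bool:
    """Return True if every member of the zip passes its CRC check."""
    try:
        with zipfile.ZipFile(path) as zf:
            return zf.testzip() is None  # None means no corrupt member
    except zipfile.BadZipFile:
        return False  # file isn't even a readable zip
```

Run something like this against each copy after it lands, and silent corruption can’t sit unnoticed for months.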

Just warning you – there is going to be a lot of “this is the button I clicked, this is the screen” below, but that’s life.

Having an escape medium

Before beginning, I needed to come up with a solution that would “get me out of trouble should this PC die”. Of course, the obvious solution for holding upwards of 40GB would be a NAS, or Network Attached Storage. Utilising independent media allows me to segregate any damage to my PC from the NAS (viruses, power surges and water damage being the main causes).

The initial copy job should be local (as in, an internal drive) for the following reasons:

  1. You’re less likely to have corrupt backups locally as opposed to network-generated files;
  2. If the external media dies, you should have another copy of the data, and;
  3. You’ve got immediate access for recovery should you require it.

So with that, I re-purposed my Orico 2 Bay NAS (which has been awesome if you’re after a cheap NAS bay – note it does not have full “NAS” functionality) to be the medium for receiving the data. Because I encrypt my files, I needed a copy of AxCrypt and Symantec PGP Encryption in each directory I planned to restore the backups in. This was my “escape medium”.

Under 4 directories (2x NAS drives, local SSD and recovery USB) I housed the installers for AxCrypt and Symantec PGP. Note that because my backup solution uses .zip, I do not need any extra tool to extract the archives.

Backup Software

In this example (and after over 25 hours of research) I have come to the conclusion that iPerius is one of the best tools out there for data backup. I enjoyed this software so much that I paid the USD$315.00 asking price to unlock all features (and “support development”).

I will go through an example (step by step, with pictures) to show how I set up my backup policies.

Selecting Directories

The first step in creating your backup files is to set your “working” directories; the folders you wish to back up. Because I use Symantec PGP Disks, I want to back up only the .pgd files, and not the entire directory.

With iPerius, I apply a filter to include only the PGD file extension, as follows:


However, I also wanted one job per PGD file, so I added the other 3 PGD files to the exclusion list, as follows:
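If you want to see what include-then-exclude filtering amounts to, here is a minimal stand-alone sketch in Python (the file names are hypothetical; iPerius obviously has its own implementation):

```python
import fnmatch
from pathlib import Path

def select_files(root: Path, include: str, exclude: list) -> list:
    """Keep files matching the include pattern, minus an explicit exclusion list."""
    matches = [p for p in root.rglob("*") if fnmatch.fnmatch(p.name, include)]
    # Sorted for deterministic output; exclusion is by exact file name.
    return sorted(p for p in matches if p.name not in exclude)
```

So a job for one disk would use `select_files(data_dir, "*.pgd", ["work.pgd", "media.pgd", "archive.pgd"])`, matching the screenshots above.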







Why is this important?

The reason this is so important is that it demonstrates iPerius’s flexibility: it can include whole file extensions while also excluding the specific files you do not want.

Destination Information

The next step is to state where you want the data stored. In this screen there are 3 important options I wanted to identify:

  1. How many copies of the same file I want this job to keep;
  2. Whether I want the files compressed, and protected with a password, and;
  3. What I want the file to be named.

As you can see, I want to keep ~10 copies (full, not incremental) of the backup, and I have named the file based on the following parameters:
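A timestamped naming scheme plus a retention count is roughly all that’s happening here. A minimal sketch of the idea (job names and the date format are my assumptions, not the exact iPerius parameters):

```python
import datetime as dt
from pathlib import Path

KEEP = 10  # number of full copies to retain, matching my ~10 above

def archive_name(job, when=None):
    """Build a timestamped file name, e.g. 'personal-20170301-0100.zip'."""
    when = when or dt.datetime.now()
    return f"{job}-{when:%Y%m%d-%H%M}.zip"

def prune(dest: Path, job: str, keep: int = KEEP):
    """Delete the oldest archives for a job, keeping only the newest `keep`."""
    # Zero-padded timestamps mean lexicographic order == chronological order.
    archives = sorted(dest.glob(f"{job}-*.zip"))
    doomed = archives[:-keep] if len(archives) > keep else []
    for old in doomed:
        old.unlink()
    return doomed
```

Each run writes a fresh full archive under a new name, then prunes anything beyond the newest 10.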


Logging The Data

Next, we want to make the process autonomous; if you have to initiate the backup yourself, you’ll probably forget.

I went and set my days, and time, and asked the program to create a log file per backup:
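The “one log file per backup run” behaviour is easy to reproduce with the standard library, if you ever want the same audit trail outside iPerius; this is a rough stdlib equivalent, not what the program itself does:

```python
import datetime as dt
import logging
from pathlib import Path

def make_run_logger(log_dir: Path) -> logging.Logger:
    """Open a fresh, timestamped log file for this backup run."""
    log_dir.mkdir(parents=True, exist_ok=True)
    log_file = log_dir / f"backup-{dt.datetime.now():%Y%m%d-%H%M%S}.log"
    logger = logging.getLogger(str(log_file))  # unique name -> one logger per run
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(log_file)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    return logger
```

Scheduling itself I leave to the program (or Task Scheduler/cron); the point is that every run leaves its own dated file behind.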


Then, I configured the program to send me an email upon completion of each job:

Email Settings

The settings for the email portion.

I want to pay special attention to the email it sends you. Not only can you customize the header, recipients and body, but the format is easy to understand, and all this is wrapped up in a nicely presented status bar that tells you where the job is at:
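For the curious, a job-status email like this is a few lines of Python; the SMTP host and addresses below are placeholders, and the message format is my own approximation, not iPerius’s template.

```python
import smtplib
from email.message import EmailMessage

def build_status_email(job, status, sender, recipient) -> EmailMessage:
    """Assemble a job-status notification message."""
    msg = EmailMessage()
    msg["Subject"] = f"Backup job '{job}': {status}"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(f"Job '{job}' finished with status: {status}.")
    return msg

def send(msg: EmailMessage, host, port=587, user="", password=""):
    """Hand the message to an SMTP relay (credentials are placeholders)."""
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()
        if user:
            smtp.login(user, password)
        smtp.send_message(msg)
```

A subject line carrying the job name and outcome is really all you need to spot a failed night at a glance.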


The encryption portion of the data

In the past I’ve explained why it’s important to secure personal privacy, and demonstrated how basic encryption works, but never how I personally achieve this. A few years ago I purchased 15 copies of a PGP program for a customer who never went through with the deal – so now I have copies!

Symantec PGP Disks are how I store (and encrypt) data. Similar to this paper on Whole Disk Encryption, my Pretty Good Privacy application has a screen listing the PGP disks I have created:


These files can be located in Explorer, as such:


To access the files (mounting them into memory) you simply enter the passphrase for the disk in the following pop-up:


The reason I use this tool? It’s reliable, and rather secure – if my understanding of how PGP works is accurate, anyway.


So there you have it, a simple backup policy, that has enough redundancy to keep me out of trouble! Now if only Gmail labels were useful… 😉

Automatic Backup of Android – Non-Rooted

I like to dabble around as an Android enthusiast, as a hobby. Recently, I had to pay an absurd amount of money to get the onboard units of my Samsung S7 Edge replaced because KNOX enabled its Custom Binary Lock upon reboot, not allowing me into my phone.

Apart from some minor technicalities, it got me thinking: how can I achieve a full device backup, without root privileges? The process is not quick, but I have found a few solutions for myself.

Whilst it would be preferable that the solution is entirely automatic, and not require a computer, there are limitations without root privileges.

Continue reading

Cloud Services, please encrypt locally beforehand.

I know that I made a post outlining why local backups aren’t for me, but they sort of are. The entire concept of “the cloud” can be rather complex, or simple, depending on how much you want to think about it – but in summary, it is defined as:

A cloud service is any service made available to users on demand via the Internet from a cloud computing provider’s servers, as opposed to being provided from a company’s own on-premises servers.

Storing everything from entire servers on AWS infrastructure to personal data in a personal cloud storage service has become popular in 2017 – even though a number of reputable cloud services have been compromised recently.

So, why? To many, it’s a simple method of storing data to be accessed via multiple devices, and is a form of “data backup”. Poppycock!

In this post I will briefly touch on some popular cloud providers, and some basic steps to secure your personal data.

Known Cloud Services Providers

Continue reading

Backups and Me Don’t Mesh. Here’s why.

It goes without saying that the content stored on most users’ computers (that is, in the user directory) is important, regardless of what it is. That’s why it is imperative to have frequent backups of the data should something occur, such as a crypto-ransomware infection.

Nowadays, there is a plethora of cloud services readily available to store your data in “the cloud”, free of any dangers – or so they say. But does that make local backups redundant? No! You should still take action to secure the integrity of your data locally should there be any issues.

Continue reading