As I attended sessions at Microsoft TechEd 2014 last week and talked with people in the exhibit hall, a number of themes emerged, including “mobile first, cloud first”, hybrid cloud, migration to the cloud, disaster recovery as a service, and flash memory storage as a game-changer in the data center. But as I reflect on the entire experience, a statement made by John Loveall, Principal Program Manager for Microsoft Windows Server, during one of his presentations sums up the overall message of the conference: “Today it is really all about the integrated solution.”
Small and medium businesses (SMBs) are rapidly moving toward virtualizing their physical servers using VMware. But as they do so, they are also looking to minimize the cost, complexity and overhead that backing up VMware servers introduces while increasing their ability to recover their newly virtualized applications. It is these concerns that InMage’s new vContinuum software addresses by using a new technique to tap into VMware that provides near-zero-impact backups with near-real-time recoveries.
Last week I took a look at the first three factors to consider when choosing a replication software product. This week I wanted to finish my thoughts around that subject and discuss the final four factors that should be part of any evaluation of replication software.
Replication is becoming an ever more important component in the protection and recovery of applications. Anecdotal evidence already suggests that 50% or more of all SAN and NAS storage systems ship with some form of replication software, while many more organizations use replication in its other forms (application-, appliance- or host-based). But regardless of what form of replication software organizations buy, they are often unaware of the subtle ways in which replication software products differentiate themselves.
One of the principal struggles within organizations in the first decade of the new millennium has been solving Windows backup issues. Now that a new decade has arrived, the problem has changed as organizations turn their attention to how they can recover their Windows application servers in a time frame and manner that meets their requirements. But to identify such a solution, they first need to define what such a recovery solution should look like.
One of the key concerns that businesses have is how cloud providers will handle and respond to spikes in application demand. It is these questions that InMage’s newly announced cloud-optimized infrastructure is designed to answer.
These days it seems that all someone has to do is use the word “deduplication” in conjunction with a data protection product and that data protection product magically looks “better”. But organizations have to be careful not to allow deduplication to color their view of what they hope to accomplish with the implementation of disk-based data protection. Rather, organizations need to look at data protection from a different viewpoint, one that is not tainted by deduplication and allows them to fully leverage the flexibility that disk-based backup provides.
There is a perception among enterprise organizations that in order to deploy continuous data protection (CDP) technology, they also need to use high performance disk in conjunction with it. But enterprises should probably re-assess that assumption. The emergence of new and better CDP architectures such as what InMage offers enables organizations to deliver high speed CDP while using slower-performing SATA disk drives.
Here is what determines how much storage a CDP product needs. CDP initially needs an allotment of storage capacity equal to the size of the volume on which the protected data resides. This is needed so the CDP product can make a copy of all of the blocks on the production volume. However, the wild cards in how much storage the CDP product requires are based not on the size of the production volume but on two other variables.
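As a rough illustration of the sizing logic above, the sketch below estimates CDP capacity as the base copy plus an ongoing journal. The excerpt does not name the two additional variables, so treating them as the daily data change rate and the retention window is an assumption here, as are the function name and the formula itself:

```python
def cdp_storage_estimate_gb(volume_gb, daily_change_gb, retention_days):
    """Illustrative CDP capacity estimate (assumed model, not a vendor formula).

    base_copy: storage equal to the protected production volume,
               used for the initial copy of all of its blocks.
    journal:   retained changed blocks, approximated as the daily
               change rate multiplied by the retention window.
    """
    base_copy = volume_gb
    journal = daily_change_gb * retention_days
    return base_copy + journal

# Example: a 500 GB volume changing ~20 GB/day, retained for 30 days
print(cdp_storage_estimate_gb(500, 20, 30))  # 1100
```

The point of the model is that the journal term, not the volume size, dominates over time: doubling the retention window or the change rate doubles the extra capacity required, while the base copy stays fixed.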
The introduction of disk and deduplication into the backup process over the last few years has certainly helped to minimize existing backup problems. Organizations using these technologies have found that their backup success rates now approach 100% and that they no longer have to continually troubleshoot backup problems. But while these technologies may fix existing backup problems, they relegate disk to a glorified form of tape and do not serve to fundamentally transform the recovery process.