Server virtualization has effectively broken the one-to-one relationship between servers and applications, enabling more efficient use of the host’s physical resources. But this is not without its drawbacks, as applications like backup software that took advantage of those idle resources no longer have access to them. Today, I continue my interview series with Walter Angerer, Quest Software’s Senior VP of Data Protection, as we look at how backup software is evolving in light of the new challenges that server virtualization creates, in order to become smarter and more agile, and to do a lot more than just back up data.
Jerome: As backup software makes this transition from a “one size fits all” to a more server-centric view, in essence, it is becoming more modern. So should backup software remain classified as data protection software or does it now need to move into areas such as “data management,” “search,” or “recovery management”? Or is that outside the realm of what backup software should delve into?
Walter: That’s a very good question. One thing that’s clear is the current scope of data protection is insufficient to fully address today’s challenges. We have to deal with environments where some data resides on physical hosts, some data resides in virtual environments, and some data resides in the cloud – public, private, or both. That’s a complex challenge for any IT department.
So at the very minimum, I think the scope of data protection has to be expanded in a couple of directions. One, we have to make sure we’re more focused on adhering to our service level agreements (SLAs), especially when it comes to the recovery side of the equation.
That means we need our backup and recovery software to be more aware of, and integrated with, our applications, especially those that are mission-critical, and it means we need to be cognizant of which protection methodologies we apply, and truly understand whether or not those methods enable us to meet our SLAs.
Second, we need to focus on the management side of the equation as well. We need more awareness as to where our applications reside – whether they reside on physical hosts, virtual hosts, or in the cloud. When you have that level of awareness and understanding, you can then align and manage your resources accordingly, and ensure that admins have what they need to quickly recover those applications that are most critical.
Regarding search, I really haven’t heard customers clamoring for capabilities that go beyond what most enterprise backup and recovery software solutions already deliver. Most technologies today allow you to use keyword searches and other things of that nature, and that’s generally what customers are looking for. Now, when you look specifically at virtual backup and recovery, I think there’s still somewhat of a gap when it comes to search.
Most products on the market today don’t allow you to look inside an image-based backup. That’s not acceptable anymore. There is so much agility in these environments that it’s imperative for organizations to know which host their information resides on.
Jerome: You cannot talk about backup without talking about the impact that server virtualization has had on backup. Maybe more than anything else, virtualization has driven many of the changes we are seeing in backup today. Can you talk about some of the major new trends you are seeing in backup as a result of server virtualization, and what are some of the best practices for protecting a growing virtual environment in enterprise shops?
Walter: I think it’s important to first understand the benefits of server virtualization. Virtualization has grown in popularity because it enables companies to better utilize physical resources, and makes it possible for them to get by with fewer physical servers.
When you think about physical environments, provisioning a new server can be a considerable undertaking, and it can take quite a bit of time. In a virtual environment, that’s not the case. So virtualization really delivers optimal resource utilization. It also enables a great deal of agility. That’s why you often see a large number of new VMs popping up on a consistent basis.
What’s interesting is the same things that make virtualization beneficial make it challenging for backup. For starters, in virtual environments, backup software must be much more agile, so that it can quickly detect the machines it needs to protect, and then protect them at the right level.
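The “quickly detect the machines it needs to protect” idea can be sketched in a few lines. This is a hedged illustration, not any vendor’s actual implementation: the VM names and the in-memory lists are hypothetical stand-ins for what a real product would pull from the hypervisor’s inventory API (e.g. vSphere).

```python
# Sketch: auto-discovering newly created VMs so they can be assigned a
# protection policy. The inventory and protected lists are illustrative
# placeholders for live hypervisor and backup-job queries.

def find_unprotected_vms(inventory, protected):
    """Return VMs present in the hypervisor inventory but not yet
    covered by any backup job, in sorted order."""
    return sorted(set(inventory) - set(protected))

inventory = ["web-01", "web-02", "db-01", "test-07"]  # VMs currently on the hosts
protected = ["web-01", "db-01"]                       # VMs already in a backup job
print(find_unprotected_vms(inventory, protected))     # → ['test-07', 'web-02']
```

In practice this comparison would run on a schedule, so that new VMs “popping up on a consistent basis” are swept into a protection policy automatically rather than waiting on an administrator.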
Secondly, whereas within physical environments there’s generally plenty of spare CPU and network bandwidth on the hosts, virtual hosts are pretty well loaded even before the backup kicks in. That means the backup can no longer afford to create a lot of overhead on the server.
So what companies really need is to have a solution in place that’s architected specifically to protect virtual environments. You want a solution that’s agent-less, so you can quickly adapt to a changing environment. You want to have capabilities like image-based backups, so you can run full VMDK or virtual machine restores, as well as granular restores.
You need to ensure you can provide LAN-free backups and restores, so that you avoid introducing any type of overhead within the virtual machine, and lastly, you need to make sure your solution keeps track of what it backs up, and where it came from, in order to facilitate fast, granular restores of the right data.
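The last point, keeping track of what was backed up and where it came from, amounts to maintaining a catalog. Here is a minimal sketch of that idea, with entirely hypothetical names: each file in an image-based backup is mapped to its source VM, host, and image, so a granular restore can locate the right image without mounting and scanning every backup.

```python
# Sketch: a backup catalog mapping file paths to their source VM, host,
# and backup image, to support fast granular restores. All identifiers
# here are illustrative, not a real product's schema.

class BackupCatalog:
    def __init__(self):
        self._entries = {}  # file path -> (vm, host, image_id)

    def record(self, image_id, vm, host, files):
        """Index the files captured in one image-based backup."""
        for path in files:
            self._entries[path] = (vm, host, image_id)

    def locate(self, path):
        """Return (vm, host, image_id) holding a file, or None."""
        return self._entries.get(path)

catalog = BackupCatalog()
catalog.record("img-0042", vm="db-01", host="esx-3",
               files=["/var/lib/db/orders.db"])
print(catalog.locate("/var/lib/db/orders.db"))  # ('db-01', 'esx-3', 'img-0042')
```

A catalog like this is also what closes the search gap mentioned earlier: if the index is built at backup time, admins can answer “which image holds this file?” without opening the images themselves.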
In Part I of this interview series, Walter and I discuss how backup is changing and examine the quantum leaps forward that have occurred in how backup and recovery are done.
In Part II of this interview series, Walter and I explore how backup software needs to evolve to address new requirements to manage recovery, as well as the new challenges that Big Data is placing on data protection and recovery.
In Part IV of this interview series, we explore whether or not virtualization-only backup software solutions can survive long term.
In Part V of this interview series, Quest Software lays out its future plans for vRanger and NetVault Backup.