Data Protection Redesign is Top of Mind in 2009: Interview with Symantec SVP Deepak Mohan, Part I of III

Data protection is top of mind for more enterprise organizations today as they look to redesign how they protect their data. Rapidly changing economic forces, new technologies and steadily growing data volumes are prompting enterprises to rethink how they can best protect, manage and recover their data by leveraging these new technologies, without adding staff or extraordinary costs to accomplish these objectives.
To get Symantec’s take on these new challenges facing organizations, DCIG lead analyst Jerome Wendt recently met with Deepak Mohan, Symantec’s senior vice president of the Information Management Group. In this first blog entry of a three-part series, Jerome and Deepak discuss some of the larger trends in data protection in 2009, how Symantec is delivering on these requirements now, and what Symantec is doing to help organizations achieve these objectives.

Jerome: What do you see as the largest trends in data protection in 2009 and how is Symantec delivering on those specific trends?
Deepak Mohan: Symantec breaks trends into two categories: business trends and technology trends. In the current macroeconomic environment, cost is a big issue as budgets are going down and data volumes are going up. In response, more business consolidation is occurring, but these consolidations bring their own challenges; consider the data center mergers now underway at large financial companies.

From a technology perspective, we look at what analyst firms such as Gartner and TheInfoPro, which surveys Fortune 1000 companies, are saying. Their surveys show that “Redesigning Data Protection” sits at the top of IT priority lists and is where organizations are focused right now, ahead of other areas like storage re-architecture, server virtualization and archiving.
Data protection redesign is at the top of these lists because enterprises have more platforms, more applications, more servers, more virtual machines, more databases – essentially more of everything – that are all spread out globally and need 24×7 availability. Aggravating the situation, recovery time objectives keep shrinking and downtime is not an option.

The sprawl of server virtualization further increases the complexity of data protection. Once virtual machines are deployed, they are hard to manage, and it is difficult to funnel all of that data to a single physical device. Virtual machines also move between physical servers, which makes disaster recovery considerably more complex.

Symantec also sees software as a service, or SaaS, emerging as a trend, and specifically backup to the cloud. This trend originated in the consumer space, where consumers began backing up data to the cloud ahead of everyone else, and it is now moving into small businesses; Symantec sees cloud backup as a great way for SMBs to protect their data.

Jerome: More organizations are focusing on using their existing resources more efficiently and effectively. What features does Symantec deliver as part of its data protection solutions that will help organizations achieve those objectives?
Deepak Mohan: There are three things that customers care about today: cost, utilization and time. Customers want to reduce the cost of managing their storage and data protection resources as they watch their data volumes grow exponentially while their IT budgets shrink. To do this, they need to utilize their resources better, which storage and server virtualization technologies are helping to deliver. And because they have less time to manage these resources and need to recover their data more quickly, meeting recovery time objectives and SLAs becomes extremely important.

To deliver on these three objectives and become efficient in data protection at the same time, you have to architect your storage environment correctly and then leverage those technologies effectively. Right now that is not the case in most environments. According to research in Symantec’s recent State of the Data Center report, storage capacity continues to grow: the survey shows a median of about 200 TB of raw storage, with about half of that capacity unutilized. So it first behooves our customers to stop buying storage. There are many technologies that can help optimize existing systems and extend their useful life.
To reduce primary storage requirements and speed application performance, customers can leverage Symantec Enterprise Vault to optimize their email servers and file systems, gaining world-class archiving, search and e-discovery capabilities. To reduce secondary storage requirements and speed backup performance, customers can utilize deduplication technologies like NetBackup PureDisk, or a Data Domain appliance via the NetBackup OpenStorage interface, to further improve disk utilization in the data center.

Symantec storage resource management solutions like Veritas CommandCentral Storage and Veritas Storage Foundation can also help customers manage existing storage so they know where their assets are and can provision them intelligently instead of over-allocating capacity. To virtually add storage headroom without buying new arrays, organizations can use Veritas Volume Manager’s thin provisioning support to better utilize existing storage capacity across Linux, Unix and Windows platforms. Managing storage resources more effectively reduces the costs and the time associated with data protection, which makes IT organizations more efficient.

In part 2 of this 3-part series, Mohan provides some specifics on how Symantec differentiates itself from competitors in the market by providing customers a full ecosystem of integrated products that encompass data protection, data management and data security.
