Perhaps nowhere does the complexity of the IT infrastructure within today’s organizations come more clearly into focus than when viewed from the perspective of data protection. Backup and recovery software sees firsthand all of the applications and operating systems in an enterprise’s environment. Yet, at the same time, it is expected to account for this complexity by centralizing management, holding the line on costs, and simplifying these tasks even as it meets heightened end-user demands for faster backups and recoveries. To break through this complexity, there are three tips that any organization can follow to help accelerate and simplify the protection and recovery of data in its environment.
Toward the end of April, Wikibon’s David Floyer posted an article on the topic of server SANs entitled “The Rise of Server SANs,” which generated a fair amount of attention and was even the focus of a number of conversations that I had at this past week’s Symantec Vision 2014 conference in Las Vegas. However, I have to admit that when I first glanced at some of the forecasts and charts included in that piece, I thought Wikibon was smoking pot and brushed it off. But after some lengthy conversations with attendees at Symantec Vision, I can certainly see why Wikibon made some of the claims that it did.
Enterprises view in-server flash with more than a little wariness. On the one hand, they see how it tremendously accelerates application performance at a fraction of what flash on storage arrays costs. On the other hand, data on in-server flash is no longer centrally stored, so it must be managed as a “one-off,” which detracts from its appeal. Leveraging the new SmartIO feature in Storage Foundation 6.1, organizations can begin to realize the best of what in-server flash offers while eliminating this drawback.
Many enterprises have been watching the development of flash with a high level of interest, though they have cautiously deployed it as a storage tier because of its high cost. Symantec Storage Foundation 6.1 takes these concerns head-on by delivering functionality that will inherently change the way enterprises design, implement, and maintain shared storage environments. In particular, Storage Foundation’s new Flexible Storage Sharing (FSS) feature gives organizations the flexibility to non-disruptively put capacity, performance, or both in their servers while still making it accessible to all of the applications in that cluster.
It is no secret that almost every organization regardless of its size has growing amounts of unstructured data that reside almost everywhere. The BIG unknown is what useful information, if any, these data repositories contain and what value, cost or risk they present. Using Symantec Data Insight 4.0, organizations can better understand the data that resides within their Dark Data repositories, the context in which it is being used, and then take informed actions to better manage and secure this data.
The storm season is once again upon us, and it looks like it will be another one for the record books, as evidenced by the tornadoes that have already hit Oklahoma and many other states. In fact, if you live in the Midwest, and particularly in eastern and southeastern Nebraska, this picture probably has an all too familiar look to it. In this particular case, it’s time to batten down the hatches and get ready for a rough ride.
Virtualization and Windows Server 2012 Adoption Poised to Explode over Next Two Years in Small and Midsized Enterprises
How fully virtualized organizations are is often a calculated guess based on anecdotal evidence or on surveys conducted by virtualization providers that sample only organizations already using virtualization. King Research’s Windows Server 2012 Migration/Virtualization Survey, commissioned by Symantec, eliminates much of this guesswork and built-in bias. More importantly, it provides key insights into just how virtualized organizations are right now, how quickly they plan to virtualize their environments over the next few years, and how they would prefer to protect them once virtualized.
A reality check is going on in enterprises when it comes to cloud backup. While the vast majority recognize its value and are aggressively adopting the cloud at many levels, the intangible issues of recovery and support tend to rear their heads and have to date kept these enterprises from adopting a core cloud offering: cloud backup. It is these concerns that IBM and Symantec are teaming up to tackle so that enterprises may confidently do more than back up to the cloud – they can recover their data once it is in the cloud with a process that is supported end-to-end.
VMware VMFS Multi-Writer Flag Feature Throws Open the Doors to Using Cluster File Systems on Virtualized Highly Available Applications
Using cluster file system software on virtual machines (VMs) in VMware environments has always been a bit problematic at best. While it could be done with techniques like Raw Device Mappings (RDMs) and third-party cluster file system software, organizations needed to sacrifice “desirable” virtualization features like vMotion to achieve it.
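To sketch how the multi-writer approach works: VMware normally places an exclusive lock on a VMDK so only one VM can open it, and the multi-writer flag relaxes that lock per virtual disk so the cluster file system can coordinate concurrent access itself. The snippet below is illustrative only – the SCSI bus/device ID and the datastore path are assumptions, not values from the article:

```
# Illustrative .vmx fragment (disk ID and path are examples, not from the article).
# The shared VMDK should be eager-zeroed thick and reside on a datastore
# visible to every participating host.
scsi1:0.present  = "TRUE"
scsi1:0.fileName = "/vmfs/volumes/shared-datastore/cluster-disk.vmdk"
# Disables VMware's default single-writer lock; the cluster file system
# running in the guests is then responsible for arbitrating access.
scsi1:0.sharing  = "multi-writer"
```

The same setting can be applied as an advanced configuration parameter in the vSphere Client rather than by editing the .vmx file directly.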
Virtualizing applications to consolidate them onto fewer servers makes great sense. Applications are centralized. Hardware is more efficiently used. Data center floor space is freed up. Virtual machine (VM) loads may be more efficiently and non-disruptively redistributed between physical systems. But then the realization hits. You have put all of your proverbial eggs in one basket, and unless you have a real-time or near real-time copy of this data off-site, should a major disaster hit, your goose is cooked. The question then becomes, “What is the best way to get this data off-site?”