Most organizations already use a hybrid cloud in some way. Of the cloud’s available offerings, cloud object storage is often the one organizations adopt first, typically by moving their archival and backup data to it. Yet when they do, many also experience sticker shock. To minimize these costs, organizations need to choose a deduplication solution that embraces a hybrid cloud strategy.
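To see why deduplication matters to the monthly bill, consider some back-of-the-envelope arithmetic. The figures below are illustrative assumptions only (the per-GB price and the 20:1 reduction ratio are not quotes from any provider), but they show how directly the dedup ratio drives object storage cost:

```python
# Illustrative only: price and dedup ratio are assumed, not vendor quotes.
GB_PER_TB = 1024
logical_backup_gb = 100 * GB_PER_TB   # 100 TB of logical backup data
price_per_gb_month = 0.023            # assumed object-storage list price ($/GB-month)
dedup_ratio = 20                      # assumed 20:1 data reduction

raw_monthly = logical_backup_gb * price_per_gb_month
deduped_monthly = raw_monthly / dedup_ratio

print(f"Without dedup: ${raw_monthly:,.2f}/month")
print(f"With 20:1 dedup: ${deduped_monthly:,.2f}/month")
```

Under these assumptions, storing the data raw costs roughly twenty times what the deduplicated copy costs, which is the gap behind the "sticker shock" many organizations report.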
Every now and then a phrase shows up in the tech industry that may hurt more than it helps. ‘Instant restore’ represents one of those terms. In my mind, the word ‘instant’ implies ‘occurs in a moment’ — in practical terms, a few seconds or perhaps up to a minute or two. However, too many deduplication appliance providers use the phrase ‘instant restore’ to describe a feature of their product. Unfortunately, the use of this term may leave organizations with the impression that these products restore data more swiftly and robustly than they actually do.
More companies than ever want to use the cloud as part of their overall IT strategy. To do so, they often look to achieve some quick wins in the cloud to demonstrate its value. Achieving these quick wins also gives them practical hands-on experience in the cloud. Incorporating the cloud into your backup and disaster recovery (DR) processes may serve as the best way to get these wins.
Scalable data protection appliances have arguably emerged as one of the hottest backup trends in quite some time, possibly since the introduction of deduplication into the backup process. These appliances offer backup software, cloud connectivity, replication, and scalable storage in a single, logical converged or hyperconverged infrastructure platform. This offering simplifies backup while positioning a company to seamlessly implement the appliance as part of its disaster recovery strategy, or even to create a DR solution for the first time.
Companies are always on the lookout for simpler, more cost-effective methods to manage their infrastructure. This explains, in part, the emergence of scale-out architectures over the last few years as a preferred means for implementing backup appliances. As scale-out architectures gain momentum, it behooves companies to take a closer look at the benefits and drawbacks of both scale-out and scale-up architectures to make the best choice for their environment.
One would think that with the continuing explosion in the amount of data being created every year, the number of appliances that can reduce the amount of data stored by deduplicating it would be increasing. That statement is both true and flawed. On one hand, the number of backup and storage appliances that can deduplicate data has never been higher and continues to increase. On the other hand, the number of vendors that create physical target-based appliances dedicated to the deduplication of backup data continues to shrink.
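The core technique these appliances share is straightforward: split incoming backup data into blocks, fingerprint each block with a cryptographic hash, and store each unique block only once. Below is a minimal sketch of that idea, assuming fixed-size blocks and SHA-256 fingerprints (real appliances typically use variable-size, content-defined chunking and far more elaborate indexing):

```python
import hashlib

def dedupe(blocks):
    """Store each unique block once, keyed by its SHA-256 digest.

    Returns (store, recipe): the unique-block store and the ordered
    list of digests needed to reconstruct the original stream.
    """
    store = {}   # digest -> block bytes
    recipe = []  # ordered digests for reconstruction
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first copy seen
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    """Reassemble the original stream from the store and the recipe."""
    return b"".join(store[d] for d in recipe)

# Example: a backup stream of four 4 KB blocks, three of them identical.
blocks = [b"A" * 4096, b"B" * 4096, b"A" * 4096, b"A" * 4096]
store, recipe = dedupe(blocks)
assert len(recipe) == 4 and len(store) == 2  # 4 logical blocks, 2 stored
assert restore(store, recipe) == b"".join(blocks)
```

The ratio between logical blocks referenced and unique blocks stored is the dedup ratio vendors advertise; highly repetitive backup streams are what make that ratio so favorable.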
There is little dispute that tomorrow’s data center will become software-defined, for reasons no one entirely anticipated even as recently as a few years ago. While companies have long understood the benefits of virtualizing the infrastructure of their data centers, the complexities and costs of integrating and managing data center hardware far exceeded whatever benefits virtualization delivered. Now, thanks to technologies such as the Internet of Things (IoT), machine intelligence, and analytics, among others, companies may pursue software-defined strategies more aggressively.
In the last few years all-flash arrays have taken enterprise data centers by storm but, as that has occurred, the criteria by which organizations should evaluate storage arrays from competing vendors have changed substantially. Features that once mattered considerably now barely get anyone’s attention, while features that no one had even heard of a few years ago are closely scrutinized. Here are three features that organizations should examine on all-flash arrays, and one feature that has largely dropped off the radar.
One of the more perplexing challenges that Nutanix administrators face is how to protect the data in their Nutanix deployments. Granted, Nutanix natively offers its own data protection utilities. However, these utilities leave gaps that enterprises are unlikely to find palatable when protecting their production applications. This is where Comtrade Software’s HYCU and ExaGrid come into play as their combined solutions provide a more affordable and elegant approach to protecting Nutanix environments.
Vendors first started bandying about the phrase “cloud data management” a year or so ago. While that phrase caught my attention, specifics as to what one should expect when acquiring a “cloud data management” solution remained nebulous at best. Fast forward to this week’s Veritas Vision 2017, and I finally encountered a vendor that was providing meaningful details as to what cloud data management encompasses while simultaneously performing a 180 behind the scenes.