The business case for organizations with petabytes of file data under management to classify that data and then place it across multiple tiers of storage has never been greater. By distributing this data across disk, flash, tape and the cloud, they stand to realize significant cost savings. The catch is finding a cost-effective solution that makes it easier to administer and manage file data than simply storing it all on flash storage. This is where a solution such as the one Quantum now offers comes into play.
DCIG is pleased to announce the availability of the 2016-17 Hybrid Cloud Backup Appliance Buyer’s Guide developed from the backup appliance body of research. As core business processes become digitized, the ability to keep services online and to rapidly recover from any service interruption becomes a critical need. Given the growth and maturation of cloud services, many organizations are exploring the advantages of storing application data with cloud providers and even recovering applications in the cloud.
A storage decision that many small, midsize and large enterprise organizations are trying to make regards what type of array to host their production data on. This often comes down to the selection of either an all-flash or a hybrid storage array. Since most organizations do not have the luxury of saying, “Money is no object,” the majority are, for now, selecting hybrid storage arrays to get flash-like performance for their most active application data while using disk to store the bulk of their application data. As organizations evaluate hybrid storage arrays, there are key factors that they need to consider.
An Omaha city employee recently gained unwanted public visibility after sending twelve filing cabinets containing a hundred years of irreplaceable original building permits from the basement of City Hall to the county dump. It turns out that the head of the permits and inspections division decided to get rid of the cabinets as part of cleaning out the division’s basement storage area. He did not realize that other city employees regularly pulled the permits, which dated from the 1880s through the 1980s. He was also apparently unaware that a local preservation group was developing a plan to move the permits to a new facility in order to make them more secure and accessible to the public.
Like Omaha’s City Hall, businesses often face what appear to be incompatible priorities. IT departments are expected to keep spending in check and know that only 10-20 percent of data is ever accessed again more than 60 days after its creation. But knowing which data to keep available and which data to delete or archive can be a challenge. This type of dilemma is one of many drivers in the development of a new group of storage systems: public cloud gateways.
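The classification decision described above, which data has gone cold and is safe to archive, often starts with nothing more exotic than file access times. As a rough sketch (the function name, directory layout, and 60-day threshold are illustrative assumptions, not any vendor's actual policy), a scan like the following flags archive candidates:

```python
import os
import time

ARCHIVE_AGE_DAYS = 60  # illustrative threshold, echoing the 60-day access statistic

def archive_candidates(root, age_days=ARCHIVE_AGE_DAYS):
    """Yield paths under `root` whose last access time is older than `age_days`."""
    cutoff = time.time() - age_days * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    yield path
            except OSError:
                continue  # file vanished or is unreadable; skip it
```

Real gateway and archiving products layer policy engines, metadata databases, and recall mechanisms on top of this basic idea, but the core question they answer is the same: when was this file last touched?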
At TechEd 2014 in Houston, TX this week, Microsoft made it clear that it is no longer content to just send customers to storage array vendors to meet their storage needs, especially when it comes to embracing a cloud-oriented approach to infrastructure. In the process of improving Windows storage technology, Microsoft is effectively delivering the benefits of–and addressing the barriers to–the adoption of server SAN technology.
DCIG has concluded its analysis of 41 hybrid storage arrays for the forthcoming DCIG 2014 Hybrid Storage Array Buyer’s Guide. As we reflected on the data we had collected, there were five features that stood out as distinguishing hybrid storage arrays from one another, and from all-flash arrays or traditional arrays.
Converged infrastructures are emerging as the next “Big Thing” in enterprise datacenters with servers, storage and networking delivered as a single SKU. Yet what providers are beginning to recognize – and what organizations should begin to expect – is that unprecedented jumps in application performance and resource optimization are now possible. The first examples of these jumps are seen in today’s ZS3 Storage Systems announcement from Oracle as it raises the bar in terms of how Oracle Database performance and resource utilization can be delivered while ushering in a new era of application-storage convergence.
As we were researching arrays for inclusion in the DCIG 2013 Flash Memory Storage Array Buyer’s Guide, we kept encountering an intriguing group of companies that had designed–or were developing–storage arrays from the ground up to realize the performance benefits of an all-flash array, but with storage capacities and price points that would bring the benefits of flash memory storage to a broader range of businesses. The resulting hybrid storage arrays achieve this balance of performance, capacity, and cost by intelligently combining flash memory with large capacity disk drives in a single storage system.
As we have been working on the development of a DCIG Buyer’s Guide for Hybrid Storage Arrays, it has been interesting to see the different approaches that the vendors are taking as they seek to leverage flash memory plus traditional hard drives to deliver previously unheard-of IOPS and ultra-low latencies at a cost per GB that makes sense to a broad range of businesses. The “secret sauce” varies from vendor to vendor, but in every case it involves sophisticated caching and/or automated storage tiering software.
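The caching half of that “secret sauce” can be illustrated with a deliberately tiny model. The sketch below (class and method names are hypothetical; real arrays track heat maps, handle write-back policies, and operate at far finer granularity) shows the basic mechanism: a small flash tier acts as an LRU cache in front of a large disk tier, promoting blocks on read and evicting the coldest block when the flash tier fills.

```python
from collections import OrderedDict

class HybridTierModel:
    """Toy model of a hybrid array: a fixed-size LRU flash tier
    in front of a large disk tier (a plain dict stands in for disk)."""

    def __init__(self, flash_blocks):
        self.flash_blocks = flash_blocks
        self.flash = OrderedDict()   # hot blocks; most recently used last
        self.disk = {}               # bulk capacity

    def write(self, lba, data):
        self.disk[lba] = data        # write-through to the disk tier

    def read(self, lba):
        if lba in self.flash:        # flash hit: the low-latency path
            self.flash.move_to_end(lba)
            return self.flash[lba]
        data = self.disk[lba]        # miss: read from disk, then promote
        self.flash[lba] = data
        if len(self.flash) > self.flash_blocks:
            self.flash.popitem(last=False)  # evict the coldest block
        return data
```

Automated storage tiering works on the same principle but moves data between tiers in the background based on longer-term access statistics, rather than promoting on every read.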
Over the years big data has crept into the everyday life of systems administrators. Attempts to solve the big data problem in both block and file storage emerged as data management software. While data management software struggled to get a footing, deduplication and compression took off, stunting data management software’s growth.
Deduplication and compression technologies have well-known capabilities in both the storage and information disciplines. However, they fall short in one significant way: these technologies do not ease the burden of information management.
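The distinction matters because deduplication operates purely on the bytes, not on what the data means. A minimal sketch of content-hash deduplication (class name, chunk size, and API are illustrative assumptions, not any product's design) makes the point: identical chunks are stored once, yet the system learns nothing about which files matter.

```python
import hashlib

class DedupeStore:
    """Toy block-level deduplication: identical chunks are stored
    exactly once, keyed by their SHA-256 digest."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}   # digest -> chunk bytes (stored once)
        self.files = {}    # file name -> ordered list of digests

    def put(self, name, data):
        digests = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # only new chunks consume space
            digests.append(digest)
        self.files[name] = digests

    def get(self, name):
        return b"".join(self.chunks[d] for d in self.files[name])
```

The store shrinks redundant capacity, but nothing in it records ownership, retention requirements, or business value, which is exactly the information-management burden the technology leaves untouched.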