Quick Calculations Show that Enterprises Can Save Over a Million Dollars Using Archival Storage

Lately I have spent some time mulling over the results of the recent Symantec 2008 State of the Data Center report. Specifically, I have been examining how much money an enterprise can potentially save by placing data on different storage tiers instead of following the path of placing most of its data on primary storage, as many do now. What I quickly discovered from some back-of-the-envelope calculations was that an enterprise with 100+ TB of storage could potentially realize over a million dollars in storage savings by placing the right data on the right tier of storage.
To arrive at this conclusion, I made the following assumptions. First, I figured that most enterprise organizations are not optimizing the placement of their production data for most of their applications. Instead, they are following the path of least resistance, so most production data (about 95%) ends up on primary storage while maybe 5% is archived.
Using these percentages, I did some back-of-the-envelope calculations where I plugged 128 TB of raw storage into the formula (this is the amount of storage that the median size organization possesses according to the latest results from the 2008 State of the Data Center report). Treating 128 TB as 131,072 GB and assuming that archival storage was purchased at $2/GB and primary storage at $25/GB, I determined that these organizations have spent approximately $3,126,067 for all of their storage (124,518 GB of primary storage at $25/GB plus 6,554 GB of archival storage at $2/GB).
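
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same back-of-the-envelope model. The tier percentages and per-GB prices are the assumptions stated above (not measured figures), and 1 TB is treated as 1,024 GB, which is the only convention that reproduces the $3,126,067 total.

```python
# Back-of-the-envelope storage cost model.
# Assumptions from the article: 128 TB raw storage, 95% on primary
# storage at $25/GB, 5% on archival storage at $2/GB.
RAW_TB = 128
GB_PER_TB = 1024  # treating 1 TB as 1,024 GB

def tiered_cost(mix):
    """Total cost of RAW_TB of storage for a {tier: (fraction, $/GB)} mix."""
    total_gb = RAW_TB * GB_PER_TB
    return sum(total_gb * fraction * price_per_gb
               for fraction, price_per_gb in mix.values())

status_quo = {
    "primary":  (0.95, 25.0),  # 124,518 GB at $25/GB
    "archival": (0.05, 2.0),   #   6,554 GB at  $2/GB
}

print(f"Status quo cost: ${tiered_cost(status_quo):,.0f}")  # ~$3,126,067
```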
Please note that I am not saying it is wrong for organizations to spend this much money on primary storage, especially if they are using it for the purposes for which it is intended. But my own experience tells me that organizations do not have a good handle on the performance characteristics of their data on primary storage, and recent studies support that. A recent InformationWeek article by George Crump suggests that 70%, and possibly as much as 85%, of all data on primary storage has not been accessed in the last 90 days. Whether all of that data belongs on archival storage is subject to debate, but certainly most of it does not belong on primary storage.
Further, it can be argued that even primary storage is not meeting the highest performance requirements of some organizations. I just spoke with an end user the other day who has an entire team of storage experts constantly tuning the placement of data on their primary storage systems to meet the performance requirements of some of their applications. This involves short stroking disk drives, scheduling jobs to avoid overlap, and balancing disk sizes against storage system memory to achieve optimal performance for an application. These environments strike me as prime candidates for Tier 0 (solid state disk) storage, assuming they could find a way to cost justify it.
Well, using archival storage, they just might be able to do that. I did some further calculations assuming that 75% of all data is placed on archival storage (again $2/GB), 5% is placed on Tier 0 or SSD storage ($100/GB) and only 20% remains on primary storage (again $25/GB). I again used the 128 TB figure for a median sized organization, and the result surprised even me: total storage costs dropped to $1,507,328. Assuming an organization with this much storage achieves these percentages, it will realize (see the sketch after this list for the arithmetic):

  • Storage savings of over $1.6 million
  • The introduction of roughly 6.4 TB of Tier 0 storage into their environment.
  • Savings that could exceed $2.2 million if no Tier 0 storage is needed by the organization and it increases its percentage of archival storage to 80%.
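
The sketch below extends the earlier Python model to the two tier mixes just described; again, the tier percentages and prices are the stated assumptions, not measured figures.

```python
# Proposed mix: 75% archival, 5% Tier 0 (SSD), 20% primary.
tiered_mix = {
    "archival": (0.75, 2.0),
    "tier0":    (0.05, 100.0),
    "primary":  (0.20, 25.0),
}

# Alternative mix if no Tier 0 storage is needed: 80% archival, 20% primary.
no_ssd_mix = {
    "archival": (0.80, 2.0),
    "primary":  (0.20, 25.0),
}

baseline = tiered_cost(status_quo)                                    # ~$3,126,067
print(f"Tiered cost:     ${tiered_cost(tiered_mix):,.0f}")            # ~$1,507,328
print(f"Savings:         ${baseline - tiered_cost(tiered_mix):,.0f}") # ~$1,618,739
print(f"No-SSD savings:  ${baseline - tiered_cost(no_ssd_mix):,.0f}") # ~$2,260,992
```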

Granted, there may be some other costs involved in achieving this improved level of storage management, but will they outweigh the cost benefits that this new model of storage management so clearly provides? That is for each organization to decide. But when you look at an archival storage platform such as Permabit's, which delivers every feature that organizations need for this tier of storage (high availability, redundancy, data integrity and reliability) at a fraction of the price of primary storage, it should force organizations with this much data to re-examine the role that archival storage systems currently play in their infrastructure. As the example above demonstrates, the better job enterprises do of aligning their storage costs with their usage, the more effective they become at improving application performance and stretching their IT budget dollars.
