I received an interesting email this past Friday. StrongLink, a company I am joining for a webinar on intelligent data management on March 15 at noon EDT, shared with me an almost unbelievable screenshot (pictured below). It displayed a job from its offering that had just scanned a file store that was “1.8 EB” in size. I did a double-take, to say the least. 1.8 EB?! Did I read that correctly, and did it really mean a 1.8 exabyte file store?! As a matter of fact, yes, yes it did.
The Look of Disbelief
I suppose everyone involved in IT manages storage at some level. However, I got serious about managing it in the early 2000s when I assumed the role of Storage Administrator at a Fortune 500 company. At that time, when I told people I was responsible for managing over 10TB of storage, they would look at me in near disbelief.
Fast forward to today. No one was around when I first saw the screenshot attached to the email showing that a single file store was 1.8 EB in size. Had anyone been there, though, they likely would have seen a similar look of disbelief on my face.
Managing Exabytes of Data – Believe It
Seeing that 1.8 EB file store made me realize that managing exabytes of data should no longer seem unbelievable for an organization of any size. Given the multiple sources from which any organization may gather data, reaching a single exabyte no longer seems unattainable.
Do not misunderstand me. I do not expect most organizations, or perhaps even many of them, to manage exabytes of data today. However, 20 years ago 10TB of storage seemed like a lot. Now I have nearly that amount (if not more) installed in the computers in my home.
As such, it is probably not that far-fetched to believe that in another 20 years we may each have 10EB of storage in our home offices. While I, for one, still cannot see how that might occur, who knows what new technologies may come along that require that much storage capacity.
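For perspective, here is a quick back-of-the-envelope calculation of what that jump would imply. This is my own arithmetic, not a projection from StrongLink or DCIG:

```python
# Back-of-the-envelope check: growing from 10 TB to 10 EB in 20 years
# is a millionfold increase, which works out to roughly doubling every year.
years = 20
start_tb = 10                  # ~10 TB: a "large" store circa the early 2000s
end_tb = 10 * 1000**2          # 10 EB expressed in TB (1 EB = 1,000,000 TB)

growth_factor = end_tb / start_tb             # 1,000,000x overall
annual_growth = growth_factor ** (1 / years)  # ~2x per year

print(f"Overall growth: {growth_factor:,.0f}x")
print(f"Implied annual growth: {annual_growth:.2f}x per year")
```

Roughly doubling every year sounds aggressive, but it mirrors the pattern above: what counted as an enterprise-scale store 20 years ago now sits in a home office.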
Time for Intelligent Data Management
But some of you are where I was 20 years ago. You are 20 years ahead of your time and already manage hundreds of petabytes of data, or even exabytes, as some organizations clearly do. If that is the case, you need to get smart about storing and managing all this data and the storage it uses.
That’s why next Wednesday, March 15, at noon EDT, I am joining the Neola Group and StrongLink for a webinar. The webinar will focus on Managing Hybrid Cloud Data Sprawl in Government and Research Environments.
Government and research organizations already face tomorrow’s data management challenges today. As a result, they need to introduce intelligence into their infrastructure that does more than help them find their data. They need a solution that places that data on the most appropriate storage tier, whether cloud, disk, or tape.
At the same time, the solution needs to abstract away the complexity of managing the storage products under its control and their respective tiers. Otherwise, the cost and complexity of managing all that “cheap storage” will outstrip whatever savings one might hope to realize.
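To make the idea of intelligent tiering a bit more concrete, here is a minimal sketch of the kind of policy logic such a solution automates. The tier names and thresholds below are hypothetical assumptions of my own, not StrongLink’s actual policy engine or API:

```python
from datetime import datetime, timedelta

# Hypothetical tiers and thresholds -- purely illustrative, not any
# vendor's actual placement policy.
def choose_tier(last_accessed: datetime, now: datetime) -> str:
    """Pick a storage tier based on how recently a file was accessed."""
    age = now - last_accessed
    if age < timedelta(days=30):
        return "flash"   # hot data stays on the fastest tier
    if age < timedelta(days=180):
        return "disk"    # warm data moves to lower-cost disk
    if age < timedelta(days=730):
        return "cloud"   # cool data goes to cloud object storage
    return "tape"        # cold data lands on the cheapest, deepest tier

# Example: a file untouched for a year would be placed on the cloud tier.
now = datetime.now()
print(choose_tier(now - timedelta(days=365), now))  # -> "cloud"
```

The point of the abstraction is that an administrator sets rules like these once; the intelligence lies in applying them automatically across billions of files and whatever storage products sit underneath.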
If you find yourself facing this future today, I encourage you to join me, the Neola Group, and StrongLink for this upcoming webinar. In it, you can learn how you may be able to use the latest intelligent data management technologies to address your current storage dilemma.
KEEP UP TO DATE WITH DCIG
For those interested in learning more, DCIG is currently researching all-flash arrays (AFAs), cloud object storage, M365 backup solutions, and software-defined HCI solutions. It will soon release one or more TOP 5 reports on the best offerings in each of these categories for various use cases.
To be notified of new DCIG articles, reports, and webinars, sign up for DCIG’s free weekly Newsletter.
To learn about DCIG’s forthcoming research and publications, see the DCIG Editorial Calendar.
Technology providers interested in licensing DCIG TOP 5 reports or having DCIG produce custom reports should contact DCIG for more information.