DCIG is pleased to announce the immediate availability of the 2022-23 DCIG TOP 5 Storage for Life Sciences Solutions report. DCIG evaluated fifteen providers offering storage solutions across six different categories. This report identifies the TOP 5 rising vendors organizations should consider for the unique storage requirements of life science organizations.
Category: Big Data
Everyone tends to think of Big Data as only being a challenge for big organizations. No more. Small and midsize organizations face their own set of big data challenges. Even the mobile, remote, and branch offices of large organizations face big data challenges similar to those of these smaller, autonomous organizations. To avoid being caught unprepared, these smaller shops should flag three specific areas for data growth.
In a recent interview, James Coomer, Senior Vice President for Products at DDN Storage, revealed how the largest private storage company in the world is now bringing its HPC expertise to the enterprise, and enterprise features to HPC.
Enterprises of all sizes are using unstructured file data in new and varied ways to drive innovation and create value. Whether they are optimizing retail operations through AI-enabled video analytics or improving healthcare through an in-depth analysis of high-resolution microscopy images, they are capturing file data from more sources and in a greater variety of file sizes than in the past. In turn, they must store multiple petabytes of data to support these new workloads. They need petabyte-scale file data management.
Mention data management to almost any seasoned IT professional and they will almost immediately greet the term with skepticism. While organizations have found they can manage their data within certain limits, when they remove those boundaries and attempt to do so at scale, those initiatives have historically fallen far short if not outright failed. It is time for that perception to change. Twenty years in the making, Commvault Activate puts organizations in a position to finally manage their data at scale.
HP StoreOnce Deduplicating Backup Appliances Put Organizations on Path to Ending Big Data Backup Headaches
During the recent HP Deep Dive Analyst Event at its Fremont, CA, offices, HP shared some notable insights into the percentage of backup jobs that complete successfully (and unsuccessfully) within end-user organizations. Drawing on anonymized data gathered from hundreds of backup assessments at end-user organizations of all sizes, HP found that over 60% of them had backup job success rates of 98% or lower, and that 12% had backup success rates below 90%. Yet what is more noteworthy is that, through its use of Big Data analytics, HP has identified large backups (those that take more than 12 hours to complete) as the primary contributor to the backup headaches that organizations still experience.
Software Fueling Dell’s Transformation to Solutions Provider; Interview with Dell Software’s General Manager, Data Protection, Brett Roscoe, Part VI
Think “Dell” and you may think “PCs,” “servers,” or, even more broadly, “computer hardware.” If so, you are missing out on one of the biggest transformations going on among technology providers today: over the last 5+ years, Dell has acquired multiple software companies and is using that intellectual property (IP) to drive its internal turnaround. In this sixth installment of my interview series with Brett Roscoe, General Manager, Data Protection for Dell Software, we discuss how these software acquisitions are fueling Dell’s transformation from a hardware provider into a solutions provider.
Deriving value from the plethora of unstructured data created by today’s multiple sources of Big Data hinges on analyzing and acting on it in real time. To do so, enterprises must employ a solution that analyzes Big Data streams as they flow in. Using TIBCO Software’s Event Processing platform, enterprises can process Big Data streams while they are still in motion, gaining real-time operational intelligence so they may take the appropriate action while that action still has meaningful value.
Around two years ago, the DCIG 2011 Enterprise Scale-Out Storage Buyer’s Guide was released. At the time, we mentioned that scale-out systems were being used to store “Big Data” and create private storage clouds. Since then, scale-out storage systems have become the foundation for building out private storage clouds, which prompted DCIG to change the name of our refreshed Buyer’s Guide to better reflect the intended use case for these storage arrays.
Mention the year 2008 or 2009 to almost any person and it almost inevitably elicits a negative reaction in terms of how those years went from a business perspective. However, as DCIG renews its annual tradition of reflecting back on which blog entries were most read on its website during the course of 2012, 2008 and 2009 emerged as very good years for DCIG, producing content that is still relevant and frequently read in 2012. Today and over the next four business days, I will share which blog entries garnered the most attention on DCIG’s website in 2012.