As many new and existing vendors (Scale Computing, Simplivity, Pivot3, Nutanix) come out with these “Datacenter (DC) in a Box” and “Compute in a Can” types of solutions, it is worth noting that these are not only for SMBs; they are solutions that enterprise shops should consider as well.
90 Percent of Small and Midmarket Companies Want to Keep Their Critical Apps and Data Out of the Cloud; Scale Computing Interview Part II
There is a tendency among technology providers to sometimes pooh-pooh the virtualization needs of small and midsized businesses and focus only on the needs of the “really big enterprises.” However, when one considers that the 900,000+ companies with 20-500 employees in Canada, the UK and US are less than 30% virtualized, a tremendous opportunity exists for the right technology provider to meet their specific needs.
Complexity in Midmarket IT Solutions Driving Need for the Hyper Converged Infrastructure; Interview with Scale Computing Part I
IT staff in midsized organizations face a peculiar challenge: they are expected to be masters of the technology in use at the organization while also staying up to speed on all internal business initiatives. To accomplish this twin feat, they need a new type of product that takes the best technologies available today, packages them as a single SKU and then makes them easy to install and manage.
Software-defined Storage is “Good” – Just Not All Versions of It May be Equally Well Suited for Your Organization
It seemed only moments after EMC announced its ViPR software-defined storage platform at EMC World this week that the attack dogs (primarily its competitors) were out in full force, pointing out ViPR’s shortcomings and attacking its merits. But its competitors need to be careful how they go about discrediting EMC’s version of software-defined storage. EMC’s promotion of it will lift the entire software-defined storage tide and help make it a viable option for the many end users who want and need it.
The main theme at this year’s EMC World is “Lead the Transformation,” which EMC is illustrating through the use of superhero characters. The superheroes represent end users who come up with solutions to manage today’s complex storage environments, while the villain, “Doc Lock-in,” requires our superheroes to lock in on a single vendor to mitigate this complexity. Yet for those users who think strategically about their storage acquisitions, Doc Lock-in may not be the full-fledged villain that EMC World portrays him to be.
About a decade ago, give or take a few years, a huge debate raged in the storage industry as to what was the best form of storage virtualization. Over time, however, all that debate created was an equally large sense of fatigue, with many people souring on the whole topic of storage virtualization. To resolve that, the term “storage virtualization” has been given a facelift at the 2013 EMC World, along with a more politically correct name: Software-Defined Storage, available from EMC as EMC ViPR.
Last week’s acquisition of NexGen Storage by Fusion-io was greeted with quite a bit of fanfare by the storage industry. But to an individual who has covered Fusion-io for many years and talked one-on-one with its top executives on multiple occasions, its acquisition of NexGen signaled that Fusion-io wanted to do more than deliver an external storage array that had its technology built in. Rather, Fusion-io felt it was incumbent upon it to take action and accelerate the coming data center transformation it has talked and written about for years.
The need of businesses for greater responsiveness from their IT departments is driving data center automation. Data center automation requires a new approach to network architecture that results in networks that are flat for high performance, multipath for high availability, and open to orchestration for quick provisioning and re-provisioning as application loads move within and among data centers.
Capacity-based Versus Socket-based Backup Software Licensing – Determining Which is Best for Your Organization
As recently as a few years ago, nearly every backup software product licensed its software based upon criteria such as the number of servers protected and which applications its backup agents needed to protect. But with the rise of virtual machines (VMs) and the complexity that approach to licensing created, most have now switched to – or at least offer as an option – either capacity-based or socket-based backup licensing. As licensing is sometimes the issue that determines which product gets selected to perform backup in your environment, it is important to understand the pros and cons of each.
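To make the trade-off concrete, here is a minimal sketch of the cost arithmetic behind the two models. All prices and environment figures are illustrative assumptions, not any vendor’s list prices: capacity-based licensing bills per TB of protected data, while socket-based licensing bills per physical CPU socket on the virtualization hosts.

```python
# Hypothetical comparison of capacity-based vs socket-based backup licensing.
# Prices below are made-up illustrations, not real vendor pricing.

def capacity_cost(protected_tb: float, price_per_tb: float) -> float:
    """License cost when billed per TB of protected (front-end) data."""
    return protected_tb * price_per_tb

def socket_cost(sockets: int, price_per_socket: float) -> float:
    """License cost when billed per physical CPU socket on the hosts."""
    return sockets * price_per_socket

# Example environment: 8 dual-socket virtualization hosts protecting 40 TB.
sockets = 8 * 2
protected_tb = 40.0

cap = capacity_cost(protected_tb, price_per_tb=500.0)     # 40 TB x $500/TB
sock = socket_cost(sockets, price_per_socket=1000.0)      # 16 sockets x $1,000

print(f"capacity-based: ${cap:,.0f}")   # capacity-based: $20,000
print(f"socket-based:   ${sock:,.0f}")  # socket-based:   $16,000

# General tendency: many VMs packed onto few sockets favors socket-based
# licensing; data-heavy but lightly consolidated hosts favor capacity-based.
```

The sketch illustrates why neither model wins universally: the crossover depends on how densely VMs (and their data) are consolidated per socket in your environment.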
The Cloud is the Next Step After Virtualization; Executive Q&A with PHD Virtual CMO Steve Kahan Part IV
As companies of all sizes move from physical environments to ones that are highly virtualized (or even entirely virtualized), everything changes. While “how backups are done” is sometimes viewed as the biggest change, monitoring the virtual environment and leveraging the cloud are becoming higher priorities for end users. In this fourth and final blog entry in my interview series with PHD Virtual’s CMO Steve Kahan, he discusses how virtualization monitoring and the cloud are shaping the future of backup in general and PHD Virtual specifically.