“Impossible. Nobody can jump this.” Enterprise technology buyers and IT administrators who have seen the movie Indiana Jones and the Last Crusade can relate to how Indy feels as he looks across a great chasm and is asked to step out in faith. But too often these individuals may feel the same way when asked to make an enterprise technology buying decision with little more information than what Indy possessed.
In today’s highly sophisticated age, enterprise technology buying decisions should be based on sound facts and research. Yet, as the YouTube clip of that scene illustrates, buyers are too often armed with only anecdotal information about the best next step to take.
Part of the problem is that they are not always confident that the solution they are considering will solve the problem. Even though they know what the symptoms are, they have little or no documentation that assures them the proposed solution will address the heart of the problem.
This puts them in the same predicament as Indy: forced to rely largely on their own experience and on others’ recommendations and feedback when making a buying decision. As a result, “leaps of faith” occur far too often.
These leaps are particularly unacceptable when purchasing enterprise data protection solutions. These solutions are often intended to work with multiple applications, meet competing requirements and handle data volumes and complexity that outstrip the capabilities of most solutions.
So for enterprises to move from making “leaps of faith” to leaping ahead, it is incumbent upon them to have well-defined processes and procedures in place to choose the right data protection solution.
In that vein, ten steps to accomplishing this appeared in a recent SEPATON white paper. While you can download the entire white paper to read all of the steps, four particularly resonated with me.
- Define your current environment. You would think that every enterprise has documented its current environment and keeps that documentation up to date. In practice, documentation happens infrequently, and sometimes not at all, partly because enterprises assume they already know what they have (which is rarely the case) and partly because it seems like too much work.
However, a tip someone gave me a few years back, and one that still has merit today, is to use the current backup software to expedite and simplify the documentation process. It likely already touches most of the application servers in your environment, provides some guidance as to how much data is being backed up, and even offers insight into the organization’s data change rates.
Another valuable resource is any internal infrastructure management software that catalogs your environment. Two logical places to turn here are VMware vCenter for virtual machines and whatever software the enterprise may already use to manage its physical infrastructure (CA Infrastructure Management, HP Asset Manager). Utilities like these can quickly produce the list of physical and virtual assets needed to define and understand your infrastructure.
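As a minimal sketch of the backup-software tip above: most backup products can export their job history, and even a small script can turn that export into a first-pass inventory with per-server data volumes and change rates. The CSV layout, field names, and numbers here are hypothetical; a real export from your backup software will differ.

```python
import csv
import io
from collections import defaultdict

# Hypothetical backup job export: one row per completed backup job.
# Real backup software will use different field names and formats.
SAMPLE_EXPORT = """server,job_date,full_gb,changed_gb
db01,2014-05-01,500,25
db01,2014-05-02,500,20
web01,2014-05-01,80,2
web01,2014-05-02,80,3
"""

def summarize_backups(export_text):
    """Aggregate per-server protected capacity and average daily change rate."""
    totals = defaultdict(lambda: {"full_gb": 0.0, "changed_gb": 0.0, "jobs": 0})
    for row in csv.DictReader(io.StringIO(export_text)):
        entry = totals[row["server"]]
        entry["full_gb"] = float(row["full_gb"])   # most recent full-backup size
        entry["changed_gb"] += float(row["changed_gb"])
        entry["jobs"] += 1
    summary = {}
    for server, entry in totals.items():
        avg_change_gb = entry["changed_gb"] / entry["jobs"]
        summary[server] = {
            "protected_gb": entry["full_gb"],
            "avg_daily_change_pct": round(100 * avg_change_gb / entry["full_gb"], 1),
        }
    return summary

print(summarize_backups(SAMPLE_EXPORT))
```

Even this rough cut answers two of the questions a vendor will ask on day one: how much data do you protect, and how fast does it change.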
- Ensure your scalability needs are met. “Start small, grow big” continues to be a message that resonates in enterprise shops. They want a solution they can deploy in areas where it has limited impact on production applications but, as it proves itself, can quickly and appropriately scale to meet their enterprise application workloads.
This particularly applies to an enterprise data protection solution. Once an enterprise has confidence that the solution can back up, deduplicate, replicate and restore data, the last thing it wants to do is replace it. Rather, it wants to scale the solution out and use it to meet as many needs as possible.
- Test reporting and dashboard capabilities. Today’s backup solutions typically use disk as a backup target and deduplication to minimize data stores. As a net result, these solutions require far less day-to-day oversight and maintenance than past approaches did. However, that does not eliminate the need to manage them; it simply means organizations need to take a proactive rather than a reactive approach to managing them.
For example, since tasks like troubleshooting backup failures and managing tape cartridges are largely eliminated, enterprises may turn their attention to monitoring and managing capacity growth. This requires that the solution provide improved dashboard and reporting capabilities so they can monitor:
- Which application data types deduplicate well and which ones do not
- What overall and specific application data growth rates are
- Data change rates in the environment
- Realistic deduplication ratios for each application type
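The first two metrics above reduce to simple arithmetic that any dashboard should surface, and that you can sanity-check by hand during an evaluation. This is an illustrative sketch only; the application names and numbers are made up, and real values would come from the solution’s own reports.

```python
def dedupe_ratio(logical_gb, stored_gb):
    """Ratio of logical data written to unique data stored; 10.0 means 10:1."""
    return logical_gb / stored_gb

def growth_rate_pct(previous_gb, current_gb):
    """Period-over-period growth of protected data, as a percentage."""
    return 100.0 * (current_gb - previous_gb) / previous_gb

# Hypothetical per-application figures for one reporting period.
metrics = {
    "databases":   {"logical_gb": 4000, "stored_gb": 400, "prev_gb": 3600},
    "file_shares": {"logical_gb": 1200, "stored_gb": 300, "prev_gb": 1150},
}

for app, m in metrics.items():
    print(app,
          f"dedupe {dedupe_ratio(m['logical_gb'], m['stored_gb']):.1f}:1,",
          f"growth {growth_rate_pct(m['prev_gb'], m['logical_gb']):.1f}%")
```

Running the same calculation per application type quickly shows which workloads deduplicate well (databases at 10:1 here) and which do not (file shares at 4:1), which is exactly the comparison the dashboard should make easy.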
Putting these dashboards and reports in place transforms the management of data protection solutions from firefighting into capacity planning. Enterprises may then use the information in these dashboards to continually optimize their current environments and to better forecast what they should purchase next, how much they should purchase, and when they should procure it.
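To make the capacity-planning point concrete, here is one simple way the growth-rate data from those reports could feed a purchase forecast: projecting when compounding monthly growth will fill the remaining capacity. This is a sketch under the assumption of steady compound growth, not a method any particular vendor’s product uses.

```python
import math

def months_until_full(used_tb, capacity_tb, monthly_growth_pct):
    """Months until compounding monthly growth fills the remaining capacity."""
    if used_tb >= capacity_tb:
        return 0
    growth = 1 + monthly_growth_pct / 100.0
    # Solve used * growth**n >= capacity  =>  n >= log(capacity/used) / log(growth)
    return math.ceil(math.log(capacity_tb / used_tb) / math.log(growth))

# Hypothetical example: 60 TB used of 100 TB, growing 5% per month.
print(months_until_full(60, 100, 5))
```

A projection like this turns a dashboard number into a procurement date, which is the difference between planning a purchase and reacting to a full system.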
- Evaluate the vendor staff. Every enterprise wants good, and ideally great, technology, but enterprise data protection environments are notoriously complex to configure, manage and troubleshoot. So even with the new and improved approaches that today’s data protection solutions offer, enterprises still need to verify that vendors have the right people in place to align the technology with their environment.
This encompasses more than just making sure the vendor staff has the appropriate levels of expertise and experience in its product. These individuals also need to understand the enterprise’s applications, their data protection requirements and the often unspoken idiosyncrasies of how each enterprise data center operates.
Enterprise organizations are at a point where they want and need to move beyond taking leaps of faith with their data protection purchases. Instead, they need to leap ahead and take advantage of the new technologies that are right for them. However, that will happen only when they put in place well-defined processes and procedures that ensure the selected solution meets their requirements.
These processes will help ensure they correctly test each product’s capabilities and, when they do, can document and understand the upper limits of each product’s performance, capacity and scalability. Further, by understanding a product’s potential benefits, they can better grasp which existing tasks will be automated or eliminated and what new opportunities will be created.
Only when this occurs will enterprise organizations have confidence that they have the most reliable and highest-performing data protection solution in place to protect their rapidly growing digital assets. That technical confidence will in turn allow the business to scale with confidence and give it the infrastructure required to successfully make the leap into the 21st century.