Business needs in data center redundancy: Beyond software and storage

What are the biggest mistakes in achieving data redundancy? How much does IT expertise really affect data center redundancy?

Answer


No single redundancy plan fits every company's needs. Radically different business demands and regulatory requirements add complexity to data redundancy decisions. Proper hardware and software choices only pay off when IT professionals possess keen insight into the data and its implications for the business.

One of the biggest threats to data redundancy is a lack of clear business goals and IT expertise. It's easy to acquire and deploy tools, but unless those tools are configured properly and used with business policies and objectives in mind, the value of data redundancy diminishes.

Often, IT teams simply protect everything the same way, which can be detrimental to the business. Not all business data is created equal, so protecting all data equally can be costly and inefficient.

For example, suppose a business runs 10 applications: only five may be important enough to replicate remotely, and only two of those may need frequent updates. Replicating all 10 applications frequently would consume far more bandwidth and storage than the business truly needs, as the sketch below illustrates. IT professionals must understand the data being protected and its value to the business in order to protect it appropriately.
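To make the cost of flat, one-size-fits-all replication concrete, here is a minimal sketch in Python. The application names, sizes, and daily change rates are illustrative assumptions, not figures from the article, and replication is modeled as incremental (only changed data crosses the wire):

```python
# Hypothetical sketch: flat vs. tiered replication volume.
# All names, sizes, and change rates below are invented for illustration.

APPS = {
    # name: (size_gb, daily_change_rate, replicated_offsite)
    "erp":       (500, 0.05,  True),    # critical; replicated remotely
    "crm":       (200, 0.10,  True),
    "email":     (300, 0.08,  True),
    "fileshare": (800, 0.02,  True),
    "analytics": (400, 0.03,  True),
    "wiki":      (50,  0.01,  False),   # not worth remote replication
    "dev":       (150, 0.20,  False),
    "test":      (150, 0.20,  False),
    "archive":   (900, 0.001, False),
    "intranet":  (60,  0.01,  False),
}

def daily_replication_gb(apps, replicate_everything=False):
    """GB of changed data sent to the remote site per day."""
    return sum(
        size * rate
        for size, rate, replicated in apps.values()
        if replicated or replicate_everything
    )

flat = daily_replication_gb(APPS, replicate_everything=True)
tiered = daily_replication_gb(APPS)
print(f"Replicate all 10 apps:        {flat:.1f} GB/day")
print(f"Replicate the 5 that matter:  {tiered:.1f} GB/day")
```

With these invented figures, tiering cuts the daily replication volume from roughly 159 GB to 97 GB; the exact numbers matter less than the exercise of ranking applications by business value before sizing bandwidth and remote storage.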

Regulatory compliance and industry guidelines for data storage and retention affect almost every business. IT professionals must ensure that data center redundancy stays compliant by working with the corporate compliance officer and legal counsel. For example, some redundant data is subject to a mandated retention period and must be destroyed when that time expires.
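As a hedged sketch of that last point, the snippet below flags redundant copies whose retention window has closed so they can be scheduled for secure destruction. The record fields and retention periods are hypothetical; the real periods must come from your compliance officer and legal counsel:

```python
# Hypothetical sketch: flag copies whose mandated retention has expired.
from datetime import date, timedelta

# Illustrative retention policy in days, keyed by data class.
RETENTION_DAYS = {
    "financial": 7 * 365,   # e.g., seven years
    "email":     3 * 365,
    "logs":      90,
}

def expired_copies(copies, today=None):
    """Yield copies whose retention window has closed."""
    today = today or date.today()
    for copy in copies:
        keep_until = copy["created"] + timedelta(
            days=RETENTION_DAYS[copy["data_class"]])
        if today > keep_until:
            yield copy

# Invented example records:
copies = [
    {"id": "snap-001", "data_class": "logs",      "created": date(2013, 1, 10)},
    {"id": "snap-002", "data_class": "financial", "created": date(2012, 6, 1)},
]

for c in expired_copies(copies, today=date(2013, 7, 1)):
    print(f"{c['id']}: retention expired; schedule secure destruction")
```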

Surprisingly few organizations test the recoverability of protected data. Data protection strategies should always include recoverability testing; after all, redundant data is useless if you cannot recover or use it when trouble occurs. Recoverability tests can involve restoring snapshots from the redundant storage array to the main storage array or launching redundant virtual machines on test servers to verify that the workloads are valid. Routinely test as part of your data protection scheme.
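One way to automate part of such a test, sketched under assumptions: after restoring a snapshot to a scratch location, compare the restored files against checksums recorded when the copy was made. The manifest format and helper names here are illustrative, and the restore step itself depends on your array or backup tooling:

```python
# Illustrative recoverability check: a restore only counts as a success
# if the restored data matches digests recorded at backup time.
import hashlib
import pathlib

def sha256_of(path):
    """Hash a restored file so it can be compared with the manifest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(restored_dir, manifest):
    """manifest maps relative file paths to digests recorded at backup
    time; every file must exist and match for the test to pass."""
    failures = []
    for rel_path, expected in manifest.items():
        target = pathlib.Path(restored_dir) / rel_path
        if not target.exists() or sha256_of(target) != expected:
            failures.append(rel_path)
    if failures:
        print(f"FAIL: {len(failures)} file(s) unusable: {failures}")
    else:
        print("PASS: restored data matches the recorded checksums")
    return not failures

# Example usage (hypothetical path and digest):
# verify_restore("/mnt/restore-test", {"db/backup.dat": "ab12..."})
```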

Capacity planning must extend to the remote storage subsystem, with management tools and processes to support it. If IT administrators monitor storage use and growth patterns on the remote storage system, they can add capacity as needs evolve, before the remote array runs short and causes data protection errors.
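A simple growth trend is often enough to know when to act. The sketch below fits a least-squares line to hypothetical monthly usage samples from the remote array and estimates how long until it fills; the capacity and sample figures are invented for illustration:

```python
# Hypothetical sketch: forecast when the remote array runs out of space.
capacity_tb = 100.0

# (days since first sample, used_tb) - invented monthly readings
samples = [(0, 60.0), (30, 63.1), (60, 66.2), (90, 69.0)]

# Least-squares slope (TB/day), computed without external libraries.
n = len(samples)
sx = sum(d for d, _ in samples)
sy = sum(u for _, u in samples)
sxx = sum(d * d for d, _ in samples)
sxy = sum(d * u for d, u in samples)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)

latest_day, latest_used = samples[-1]
if slope > 0:
    days_left = (capacity_tb - latest_used) / slope
    print(f"Growth ~{slope:.3f} TB/day; array full in ~{days_left:.0f} days")
else:
    print("No measurable growth; re-check after the next sample")
```

On these invented numbers the array fills in roughly 10 months, which is the kind of lead time that lets a capacity upgrade be budgeted and installed before replication starts failing.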

This was first published in July 2013
