Guide to managing data center costs and the IT budget
Your organization upgraded its technology platform to embrace virtualization and cloud computing, and built a new data center to hold the equipment. Now, ask this question: How much is it all worth?
The answer is disappointing: That equipment lost 30% of its value before you even opened the box, the moment it left the vendor's warehouse. All the time and effort you spent provisioning the hardware added no intrinsic value; its face value keeps falling as you stand there looking at it.
The real value of any IT platform is the information it stores, which is built upon the data created, aggregated, managed, analyzed and reported on. Whereas the value of the hardware decreases minute by minute, the value of the data increases -- and it should be a business imperative to protect that data as much as possible.
How available is your data?
The first thing to consider is data availability. If the business can't access the data, it can't turn data into knowledge that guides decisions. Business continuity has to top the priority list, and this requires the use of basic high-availability hardware architectures built around an N+1 or N+M (multiples) hardware redundancy model.
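As a minimal sketch of how N+M sizing works (the workload figures here are hypothetical), a cluster is sized to carry its full load on N components, then M spares are added so that M failures can be absorbed without an outage:

```python
import math

def nodes_required(workload_units: int, capacity_per_node: int, spares: int) -> int:
    """Size a cluster under an N+M redundancy model.

    N is the node count needed to carry the workload at full capacity;
    M (spares) is the number of simultaneous failures to be absorbed.
    """
    n = math.ceil(workload_units / capacity_per_node)
    return n + spares

# N+1: a 100-unit workload on 25-unit nodes -> 4 working nodes + 1 spare
print(nodes_required(100, 25, spares=1))  # 5
# N+M with M=2 for a more critical tier
print(nodes_required(100, 25, spares=2))  # 6
```

The choice between N+1 and N+M is a cost decision: each spare adds hardware that sits idle in normal operation, so deeper redundancy is usually reserved for the most critical tiers.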
Business-critical data should be mirrored across hardware. The data's significance defines whether it's mirrored in the same data center, across different storage systems on the same campus or across separate data centers in different geographies.
The cloud plays a part as well: Data mirroring used to require a company to operate twin remote data centers or to purchase expensive dedicated space at an external partner. Cloud resources can provide data mirroring at a more reasonable cost.
Next is disaster recovery, where the organization must consider backup and restore. This is no longer just a matter of image copies: Snapshots and granular backups of selected data improve the two measures that define recovery: the recovery point objective (RPO -- the most recent point to which you can recover data) and the recovery time objective (RTO -- how long it will take you to recover to that point). The two tend to be linked: If you need to be up and running again as quickly as possible (a short RTO), you may have to accept a worse RPO, and vice versa.
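A worked example makes the two measures concrete. In this sketch (the backup schedule and restore steps are hypothetical), the worst-case RPO falls out of the backup interval -- a failure just before the next backup loses a full interval of data -- while the RTO is the sum of the sequential restore steps:

```python
from datetime import timedelta

def worst_case_rpo(backup_interval: timedelta) -> timedelta:
    """Worst-case data loss: the failure lands just before the next backup."""
    return backup_interval

def estimated_rto(restore_steps: dict) -> timedelta:
    """Recovery time: the sequential restore steps added together."""
    return sum(restore_steps.values(), timedelta())

# Hypothetical schedule: hourly snapshots on top of a nightly image copy.
print(worst_case_rpo(timedelta(hours=1)))  # up to one hour of data lost

steps = {
    "provision replacement hardware": timedelta(minutes=30),
    "restore the latest image copy": timedelta(hours=2),
    "replay snapshots taken since the image": timedelta(minutes=45),
}
print(estimated_rto(steps))  # 3 hours 15 minutes to get back to the RPO
```

The tradeoff described above shows up directly: shrinking the RPO means backing up more often, which lengthens the snapshot chain that must be replayed at restore time and so pushes the RTO out.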
You down with GRC?
Data protection includes governance, risk and compliance (GRC) aspects. A complex mix of data protection laws should be considered in the context of an organization's needs for internal information governance and its responsibilities to its shareholders, suppliers, customers and others. Numerous service companies and software vendors offer discrete solutions for GRC around valuable data. For example, a company in the healthcare market can access any number of HIPAA-compliant systems; likewise, if you are in the financial services market, you will find systems that are CRD IV or Basel IV compliant.
How to choose tools
These industry-specific systems miss one crucial point: Your organization needs to deal with its data assets holistically. If personally identifiable data bleeds out of the organization through a bring your own device (BYOD) or shadow IT application, your organization could suffer brand damage and face hefty fines.
Aim for a total platform that is information-compliant. Compliance with mixed needs is easier when you focus on the data and ensure that it is secure no matter where it is. The first step is to classify data by need -- labels as simple as public, commercial in confidence and secret may be enough. Actions can be triggered by the type of data being used. Public data can be left as it is -- it makes no difference if this gets out into the public domain. Commercial data and secret data should be encrypted at rest and on the move.
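The classification-driven approach can be reduced to a simple policy table. This sketch uses the three labels suggested above; the control names are illustrative, not tied to any product:

```python
# Controls keyed by classification label. Public data needs none;
# commercial and secret data must be encrypted at rest and in transit.
POLICY = {
    "public": {"encrypt_at_rest": False, "encrypt_in_transit": False},
    "commercial-in-confidence": {"encrypt_at_rest": True, "encrypt_in_transit": True},
    "secret": {"encrypt_at_rest": True, "encrypt_in_transit": True},
}

def required_controls(classification: str) -> dict:
    """Return the controls a data item must carry, based on its label.

    Unlabelled or unknown data defaults to the strictest tier,
    so the policy fails closed rather than open.
    """
    return POLICY.get(classification, POLICY["secret"])

print(required_controls("public"))      # no encryption required
print(required_controls("unlabelled"))  # falls back to the "secret" tier
```

Keeping the policy as data rather than scattered conditionals means a new classification tier, or a tightened control, is a one-line change applied everywhere the data travels.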
BYOD access points should be controlled so that data is accessed only through secured containers, or sandboxes, where cutting and pasting from the sandbox to the consumer side of the device is blocked. An example of a problem that sandboxes can solve is that of a user accidentally choosing the wrong email address when sending a commercial document. Data leak prevention systems, such as those from Symantec, Fortinet or Blue Coat, can help stop sensitive data from crossing boundaries.
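At its core, a data leak prevention check pattern-matches outbound content against markers of sensitive data. This toy sketch is only illustrative -- the commercial products named above use far richer fingerprinting and contextual rules:

```python
import re

# Hypothetical markers of sensitive content: a classification stamp
# and anything shaped like a payment card number.
SENSITIVE_PATTERNS = [
    re.compile(r"\bcommercial in confidence\b", re.IGNORECASE),
    re.compile(r"\b\d{4}[- ]\d{4}[- ]\d{4}[- ]\d{4}\b"),
]

def block_outbound(message: str) -> bool:
    """Return True if the message should be stopped at the boundary."""
    return any(p.search(message) for p in SENSITIVE_PATTERNS)

print(block_outbound("Minutes attached - COMMERCIAL IN CONFIDENCE"))  # True
print(block_outbound("See you at lunch"))                             # False
```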
Digital rights management capabilities from vendors such as Adobe or Docurated can help ensure that sensitive information is locked or securely deleted after sitting on a device for a defined period of time.
Document collaboration vendors are upping their game to help create better data protection software. Companies such as Accellion, Huddle and Citrix (with ShareFile) build in sophisticated information management functions that work with information stores that aren't databases.
With BYOD, data management capabilities built into mobile device management tools from VMware AirWatch, IBM and Symantec can also ensure that the IT organization can lock or securely delete any data held on a user's device.
The objective in buying data protection tools is to ensure the business gains as much as possible from the information it collects and analyzes. This requires high availability with rapid disaster recovery capabilities, along with enhanced data security to stop information from leaking into the wrong hands. Aim to create an all-purpose, data-centric, compliance-oriented architecture.