As enterprises adopt public cloud and object storage systems, they will use data fabrics to manage data in these new environments alongside on-premises assets. And while no data fabric tool currently meets this need completely, several products deliver parts of these capabilities and will likely expand over time.
Why use data fabrics?
The objective of a data fabric is to streamline data management across disparate sources -- including on-premises systems and the cloud -- and to ensure data is in the right place, whether on premises, in the cloud or in one of multiple data centers, with the right performance characteristics. A data fabric also allows data to move as business requirements, and the value of that data, change.
Data fabrics bridge the IT department and application owners. The IT department owns the infrastructure, manages physical assets and delivers services. The application owners consume those services -- in this case, storage. While IT defines the available storage services, data fabrics provide application owners with a self-service model to meet their requirements.
For example, a service might be block storage for critical data. That block storage would have a minimum guaranteed performance level, a backup regime and replication to a remote data center. It might also have a retention scheme in which copies are kept on object storage for archival and compliance purposes. A different service might be available for high-performance, but re-creatable, file data that's suitable for analytics. The file share would have fast storage but no backup and replication. This service might copy data from a production system and replicate it to a public cloud, where cheap compute power is available for analytics workloads.
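The two services described above could be sketched as declarative service definitions. Here is a minimal illustration in Python; the class, field and site names are hypothetical, not any vendor's actual API:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical policy schema; all names are illustrative only.
@dataclass
class StorageService:
    name: str
    storage_type: str            # "block" or "file"
    min_iops: Optional[int]      # guaranteed performance floor, if any
    backup: bool                 # included in a backup regime?
    replicate_to: Optional[str]  # remote site or cloud, if any
    archive_to: Optional[str]    # object store for retention/compliance, if any

# Critical block storage: guaranteed performance, backup, replication, archival.
critical_block = StorageService(
    name="critical-block",
    storage_type="block",
    min_iops=10_000,
    backup=True,
    replicate_to="dr-datacenter",
    archive_to="object-archive",
)

# Re-creatable analytics file data: fast storage, no backup; copied to a
# public cloud where cheap compute power is available for analytics.
analytics_file = StorageService(
    name="analytics-file",
    storage_type="file",
    min_iops=None,
    backup=False,
    replicate_to="public-cloud",
    archive_to=None,
)
```

Application owners would pick a service by name under the self-service model; IT changes the definitions in one place rather than per application.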
Which storage vendors offer a data fabric for on premises and cloud?
NetApp's Data Fabric ties together on-premises and cloud-based storage. It's a collection of primary and secondary storage systems that can span multiple data centers, both on premises and in the cloud. The data fabric is a software layer that manages data across multiple storage technologies. The overarching idea is to break free from "data gravity," which makes data hard to move, and to have applications run in the most logical location, without having to consider where the data resides.
IoFABRIC Inc. takes a different approach to the data fabric. Its appliance, Vicinity, consumes multiple types of storage and presents them back as a unified resource. Admins configure the fabric through policies, which determine where data is stored. The appliance migrates data between storage types and locations based on these policies.
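Policy-driven placement of this kind might look like the following sketch, where the tiers, thresholds and rule shape are hypothetical rather than Vicinity's actual configuration:

```python
# Hypothetical tiering policy: choose a storage location for a data item
# based on how recently it was accessed. Names and thresholds are
# illustrative only.
def place_data(days_since_access: int, policy: dict) -> str:
    """Return the first tier whose age threshold the item falls under."""
    # The policy maps a maximum age in days to a tier; check in ascending order.
    for max_age, tier in sorted(policy.items()):
        if days_since_access <= max_age:
            return tier
    return "archive-object-store"  # fallback for anything older

policy = {7: "flash", 90: "disk"}  # hot data on flash, warm data on disk

print(place_data(2, policy))    # recently accessed -> "flash"
print(place_data(30, policy))   # warm -> "disk"
print(place_data(365, policy))  # cold -> "archive-object-store"
```

In a real fabric the appliance would re-evaluate such rules continuously and migrate data when an item's tier changes.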
ClearSky Data provides another type of data fabric: fully managed primary storage as a service. The company runs its own high-speed network between its points of presence (PoPs) in a few major cities, with large cache devices in each PoP. Customers deploy smaller cache appliances in their data centers with a network link back to the nearest PoP. The appliance delivers storage to customer data centers or to cloud data centers, and enables mobility between them.
Elastifile Ltd. has a software-only system that makes file and block storage available across multiple data centers, including public clouds. Elastifile works with a single, global namespace for data in different locations. Policies then define how data is presented in each location. For example, a folder might be replicated between an on-premises data center and a public cloud, while a different folder is cloned to a new folder and then replicated to a development data center for software testing with production data.
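The folder-level policies described above could be sketched as a mapping from paths in the global namespace to actions. The folder names and policy verbs below are hypothetical, not Elastifile's API:

```python
# Hypothetical per-folder policies in a single global namespace.
# "replicate" mirrors a folder to another location; "clone" copies it
# to a new folder (e.g., production data cloned for software testing).
namespace_policies = {
    "/prod/reports": [("replicate", "public-cloud")],
    "/prod/db": [("clone", "/test/db"), ("replicate", "dev-datacenter")],
}

def planned_actions(folder: str) -> list:
    """Expand a folder's policy list into human-readable actions."""
    return [f"{verb} {folder} -> {target}"
            for verb, target in namespace_policies.get(folder, [])]

for folder in namespace_policies:
    for action in planned_actions(folder):
        print(action)
```

The key design point is that data keeps one name everywhere; only the policies attached to that name differ by location.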
The biggest challenge with data fabrics is that they're still in the early stages of development. Like any new technology, difficulties may arise in terms of vendor options, lack of standards and a general lack of expertise.