IT industry standards body the Distributed Management Task Force (DMTF) created the Configuration Management Database Federation (CMDBf) specification on July 21, 2009. The CMDBf specification can help organizations more easily integrate CMDB data from multiple sources and enable more diversity in terms of CMDB tool sets and vendors.
Configuration items come from multiple sources
In the IT Infrastructure Library (ITIL) v2, we came to understand that the configuration items (CIs) tracked in the configuration management database (CMDB) could literally be anything -- hardware, software, documentation, people, facilities, services and -- very importantly -- their relationships to one another.
In ITIL v2, the CMDB was initially spoken of in a singular manner until people realized that because it is a database, the rules of data normalization apply and integration to other data sources is needed.
Thus, the industry coined the term "federated CMDB" to describe CMDBs that linked multiple databases together. With that integration lay both opportunities and challenges. On one hand, specific data could exist in the correct authoritative system of record, which was good, but those integrations needed to be manually programmed and maintained, which was, and still is, the painful part.
Responding to pressures to reduce integration costs, the DMTF created the CMDBf specification. The specification was assembled by BMC, CA, Fujitsu, Hitachi, HP, IBM and Oracle, and addresses the federation of Management Data Repositories (MDRs) to enable a configuration management system (CMS) through the use of Web services. The specification doesn't get into the analysis and use of data, but it does detail how data will be shared through query and registration services that exchange XML. This approach should allow for far more automated data exchanges than the old methods of writing and maintaining adapters for each discrete database integration.
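To make the query model concrete, here is a rough sketch in Python of the kind of graph query an MDR client might assemble. The general shape -- item templates plus a relationship template tying them together -- follows the approach the specification describes, but the namespace URI, record types and attribute names below are placeholders of my own for illustration, not values taken verbatim from the spec.

```python
# Sketch of a CMDBf-style graph query: "give me Service CIs, Server CIs,
# and the runs-on relationships between them." Namespace and record types
# are illustrative placeholders, not the official spec values.
import xml.etree.ElementTree as ET

CMDBF_NS = "urn:example:cmdbf-query"  # placeholder, not the DMTF namespace


def build_graph_query() -> bytes:
    """Build an XML graph query asking an MDR for services and their servers."""
    q = ET.Element(f"{{{CMDBF_NS}}}query")

    # One item template per kind of CI we want returned...
    servers = ET.SubElement(q, f"{{{CMDBF_NS}}}itemTemplate", id="servers")
    ET.SubElement(servers, f"{{{CMDBF_NS}}}recordConstraint",
                  recordType="Server")
    services = ET.SubElement(q, f"{{{CMDBF_NS}}}itemTemplate", id="services")
    ET.SubElement(services, f"{{{CMDBF_NS}}}recordConstraint",
                  recordType="Service")

    # ...plus a relationship template that ties the two item sets together.
    rel = ET.SubElement(q, f"{{{CMDBF_NS}}}relationshipTemplate", id="runs-on")
    ET.SubElement(rel, f"{{{CMDBF_NS}}}sourceTemplate", ref="services")
    ET.SubElement(rel, f"{{{CMDBF_NS}}}targetTemplate", ref="servers")

    return ET.tostring(q, encoding="utf-8")
```

In practice such a query would be POSTed to an MDR's query service endpoint, and the response would carry the matching items and relationships back as XML -- which is precisely what replaces the per-database adapter code of the older approach.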
By pursuing a Web services model, vendors that truly adopt the specification will enable customers to lower their cost of ownership and pursue more best-of-breed approaches rather than being constrained to a single vendor's suite of products. These two benefits require that the CMDBf specification be adopted by a sufficient number of vendors, notably those relevant to a given IT organization. However (and there is almost always a catch somewhere), the specification will not address many of the hurdles organizations face today with their CMS implementations -- in fact, the increased automation may just lead to faster and more efficient failure for firms that aren't careful.
My comments are not to diminish the value of the specification. If adopted by IT Service Management (ITSM) tool vendors, the spec will reduce integration headaches and costs. For organizations to truly reap the benefits of a Web services-based CMS or a hybrid CMS from integrated MDRs that follows the CMDBf specification as well as traditional integration methods, there are some fundamental issues that need to be addressed.
The importance of SACM processes for CMSes
First and foremost, a CMS is a database that supports the service asset and configuration management (SACM) process, which is tasked with managing the logical view of IT's world. IT organizations continue to try to buy their way into ITSM by purchasing products, but it simply does not work this way. ITSM is about understanding the requirements of the business and delivering IT services that meet those needs. A CMS without the support of the SACM process risks failure and suboptimal outcomes because there will be no defined roles and responsibilities, accountability, data auditing, and so on. For example, what would happen if a CMS were built without proper planning and architectural considerations?
Efficient change management for CMS updates
Second, to maintain accuracy, the updating of CMS data must be governed by an effective and efficient change management process. Without change management, data accuracy is at risk. If the data isn't timely and accurate, a death spiral may begin: Due to errors and a lack of timeliness, users stop using the CMS, the CMS in turn becomes more error-prone and out of date, which causes even fewer users to rely on it, and down it goes with all the time and money invested. Regardless of the integration method, this need for change management still exists.
A culture of continuous improvement
Third, IT organizations must foster a culture of continuous improvement, including ITIL's Continuous Service Improvement process. This need pertains to people, processes and supporting technology -- everything that goes into the services that IT provides. SACM and the CMS will be a journey that requires a formally managed approach to ensure the needs of the business are met -- and continue to be met over time. The CMDBf specification will be an integration enabler, but it can't guarantee process improvement.
CMS architecture is based on the value of its data
Fourth, the SACM process and the underpinning CMS data must constantly strive for relevancy and sustainability. It is very easy to take a technical approach and try to track everything in minute detail regardless of actual value. As a result, data entry takes too long, users become frustrated, auditing takes longer, and before you know it the death spiral begins. The architecture of the CMS must be based on value, and the data must be relevant and maintainable over time. In some instances the question may not be "Can we connect?" but "Should we connect?" The CMDBf specification can absolutely give IT access to data, perhaps even more data, but questions must be asked about what is truly needed.
Define relationships between all configuration items
Fifth, firms still aren't effectively defining and using data relationships. It is imperative that the relationships between the various CIs that make up a service are defined. This data is critical for all other processes. Change management needs it for impact assessments. Capacity and availability need the relationship data to effectively plan and monitor. Incident needs this data for performing triage activities. Event management, one of the most promising processes of ITIL v3, needs it for effective monitoring and rule execution. The list goes on and on. Some organizations are getting marginal data with automated dependency tools, but a concerted effort is needed to include all relevant CIs.
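To illustrate why relationship data is so critical, here is a minimal, hypothetical sketch: once depends-on relationships between CIs are captured, a change-impact assessment reduces to a graph traversal. The CI names and the simple dictionary data model below are invented for illustration, not drawn from any particular CMDB product.

```python
# Hedged sketch: change-impact assessment over CI relationships.
# The relationship map is hypothetical sample data: each CI maps to
# the CIs it depends on.
from collections import deque

DEPENDS_ON = {
    "online-banking": ["app-server-1", "db-cluster"],
    "app-server-1": ["vm-host-3"],
    "db-cluster": ["san-array"],
}


def impacted_by(ci: str) -> set[str]:
    """Return every CI or service that directly or transitively depends on `ci`."""
    # Invert the depends-on map: ci -> the CIs that depend on it.
    dependents: dict[str, list[str]] = {}
    for item, deps in DEPENDS_ON.items():
        for dep in deps:
            dependents.setdefault(dep, []).append(item)

    # Breadth-first traversal up the dependency chain.
    impacted: set[str] = set()
    queue = deque([ci])
    while queue:
        for parent in dependents.get(queue.popleft(), []):
            if parent not in impacted:
                impacted.add(parent)
                queue.append(parent)
    return impacted
```

With this data, a proposed change to the hypothetical "san-array" immediately surfaces "db-cluster" and "online-banking" as impacted -- exactly the answer change management, capacity, availability and incident triage all need, and one that is impossible to compute if the relationships were never recorded.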
Understanding process integration within the CMS
Last but not least, process integration and the accompanying data exchanges need to be understood. Too many isolated silos exist today, violating data normalization rules and causing errors through inconsistent updating. Organizations need to understand and manage their data requirements in the CMS instead of letting them evolve organically.
In summary, the CMDBf specification will undoubtedly help organizations reduce the costs of integrating management data repositories and enable more diversity in terms of tool sets and vendors. The more vendors that adopt the standard -- and the faster they do it -- the better off customers will be when it comes to integration. To truly gain value from their CMSes, however, IT organizations must establish effective and efficient SACM, change management and Continuous Service Improvement processes as well as focus on the need to have relevant and sustainable data that is an accurate and timely logical view of the IT world. As always, the tools need to support the processes, and it's the latter that is the largest stumbling block for IT organizations to address right now.
ABOUT THE AUTHOR: George Spafford http://www.spaffordconsulting.com/ is a Principal Consultant with Pepperweed Consulting and an experienced practitioner in business and IT operations. He is a prolific author and speaker, and has consulted and conducted training on regulatory compliance, IT governance and process improvement.
This was first published in August 2009