The benefits of trusted computing
According to the Trusted Computing Group's website, the TPM chip was built for security, privacy, interoperability, portability, control, and ease of use. Simply put, it aims to deliver strong security without letting any of those other goals fall by the wayside.
Developers designed a chip that would assure the integrity of a platform. Together with the BIOS, the TPM creates what is known as the Root of Trust. The TPM contains several Platform Configuration Registers (PCRs) that allow secure storage and reporting of security-relevant measurements (unauthorized changes to the BIOS, possible rootkit modifications, boot-sector changes, etc.). This data can be used to detect changes from previous configurations and to decide how to proceed. Microsoft's BitLocker Drive Encryption works in this way.
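To see why PCRs are tamper-evident, note that they work by hash-chaining: a PCR can never be written directly, only "extended," so a given register value can only be reproduced by replaying the exact same measurements in the exact same order. The sketch below simulates a TPM 1.2-style SHA-1 extend operation in software; the boot-chain component names are purely illustrative, and a real TPM performs this in hardware:

```python
import hashlib

def extend_pcr(pcr: bytes, measurement: bytes) -> bytes:
    # New PCR value = SHA-1(old PCR value || SHA-1(measured data)),
    # mirroring the TPM 1.2 extend operation in software.
    return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

# PCRs reset to all zeros (20 bytes, the SHA-1 digest size in TPM 1.2)
pcr = bytes(20)

# Hypothetical boot chain: each stage is measured before it runs
for component in (b"BIOS code", b"boot loader", b"OS kernel"):
    pcr = extend_pcr(pcr, component)

print(pcr.hex())  # final value depends on every measurement and its order
```

Because the chain is built from one-way hashes, malicious code cannot "undo" an earlier measurement to hide itself; any change anywhere in the boot sequence yields a different final PCR value.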
The concerns over trusted computing
Despite the numerous principles regarding the security of trusted computing, the design has raised some concerns over functionality and privacy. In practice, trusted computing uses cryptography to help enforce a selected behavior. Its main function is to allow someone else to verify that only authorized code runs on a system. Remember, used alone, trusted computing does not protect against attacks that exploit security vulnerabilities introduced by programming bugs.
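That verification step, remote attestation, boils down to recomputing the expected PCR value from a log of known-good measurements and comparing it with the value the platform reports. A minimal sketch, again simulating the SHA-1 chain in software (the "approved" component names are hypothetical, and a real verifier would also check a TPM signature over the reported value):

```python
import hashlib

def replay_log(event_digests):
    """Recompute the final PCR value from an ordered list of measurement digests."""
    pcr = bytes(20)  # PCRs reset to all zeros
    for digest in event_digests:
        pcr = hashlib.sha1(pcr + digest).digest()
    return pcr

# Verifier's golden list of approved component digests (hypothetical)
golden_log = [hashlib.sha1(b"approved bootloader").digest(),
              hashlib.sha1(b"approved kernel").digest()]
golden_pcr = replay_log(golden_log)

# A platform whose log replays to the golden value is accepted...
assert replay_log(golden_log) == golden_pcr

# ...while any extra or altered measurement changes the PCR irreversibly.
tampered_log = golden_log + [hashlib.sha1(b"unapproved module").digest()]
assert replay_log(tampered_log) != golden_pcr
```

This is also where the policy concern lies: whoever controls the golden list controls which software the platform will attest as "authorized."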
The problem arises with the core function of the chip. With trusted computing, it is technically possible not just to secure the hardware for its owner, but also to secure it against its owner.
Other similar concerns include the abuse of remote validation of software. In this scenario, the manufacturer, and not the user who owns the computer system, decides what software is allowed to run. A secondary concern is that user actions in these situations may be recorded in a proprietary database without the user's knowledge. As a result, user privacy becomes an issue, and security compliance conflicts may arise as well.
TPM in server technology
Many large server vendors sell TPM-ready machines. Still, the same cautions as above apply when deciding whether to use a TPM. These same vendors warn their customers that the TPM is a customer-configured option. Server makers like HP take their own precautions with the chip, stating that they will not configure the TPM as part of any pre-installation process. HP even notes that it is not responsible if the TPM locks users out, and it recommends backing up keys and server data before using the TPM at the server level.
BitLocker, for example, will lock access if an error is made during any of a wide variety of procedures. HP lists "updating the system or option firmware, replacing the system board, replacing a hard drive, or modifying OS application TPM settings." In other words, there is a lot of room for error.
Caution is strongly advised when deploying the TPM within a server. Make sure there is a viable use case for this technology, as any mistake can be very costly.
Securing your machine
The topic of trusted computing will continue to draw criticism and support. When used as designed, the chip can certainly provide a higher level of machine security. However, abuses and functionality questions highlight the drawbacks to adopting the technology.
Remember, computer security does not have to be chip-reliant. Security best practices can help guide administrators in the right direction if they feel uncomfortable using the TPM chip. Ensuring a system's BIOS settings are correct, keeping its firmware and software up to date, and constantly monitoring an environment's security health will keep systems running longer and more safely. Each data center is unique and has different requirements. Careful planning and research will enable an IT administrator to properly secure their infrastructure.
About the author:
Bill Kleyman, MBA, MISM, is an avid technologist with experience in network infrastructure management. His engineering work includes large virtualization deployments as well as business network design and implementation. Currently, he is the Virtualization Architect at MTM Technologies Inc. He previously worked as Director of Technology at World Wide Fittings Inc.
This was first published in October 2011