
Getting the IT and facilities relationship on track

Expert advice on how bringing together the IT and facilities departments can improve efficiency and save the company money.

Poor communication and a lack of cooperation between departments or groups are certainly nothing new -- every company can probably identify opportunities for better collaboration within the organization. But when IT and facilities groups have trouble working together, the business and its bottom line can really suffer. TechTarget editors sat down with Julius Neudorfer, CTO and founding principal of North American Access Technologies (NAAT), to discuss the underlying problems between IT and facilities departments and offer some potential solutions.

What do you see happening when IT and facilities staffs can't communicate effectively? What are the real-life consequences for businesses today?

Julius Neudorfer: Unfortunately there is a huge communications gap -- not least in the physical descriptions of the technologies being discussed -- and this miscommunication does cost money in terms of the time projects take to complete. Sometimes it's incorrectly ordered equipment or facilities that don't match what IT wants. But it's a two-way problem, and even when departments are willing to communicate, their languages don't coincide, which can impact a project significantly.

What do you see as the reasons for poor communication, and how do you see this communication changing?

JN: There are many reasons for miscommunication -- it's not just an internal issue. For example, vendors do a poor job of communicating systems' needs and how those translate to facilities. One of the problems is "label specs" like power requirements. If IT tells facilities to provide a 10 kVA outlet because the system's specifications call for that, facilities has to provide that level of outlet power even though it may not reflect the actual running power the system uses or the amount of heat it throws off. Things become over-engineered and overly expensive because everyone focuses on the specifications rather than reality.

Facilities doesn't want to under-rate the outlets and wind up tripping breakers, so they opt for the larger breaker and distribution design, which raises costs. The over-engineered power distribution system then prompts more cooling than might actually be needed, which also adds to the project cost. And even if everything is bought and paid for and installed, its efficiency is very low because there is lots of capacity running under relatively light loads.
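
To make the "label spec" gap concrete, here is a minimal back-of-the-envelope sketch in Python. The 10 kVA nameplate comes from the example above; the measured draw, power factor and 80% circuit derating are illustrative assumptions rather than figures from the interview.

```python
# Hypothetical comparison of provisioning from the nameplate ("label spec")
# versus provisioning from the measured draw. A common practice is to keep
# continuous loads at or below 80% of a circuit's rating.

NAMEPLATE_KVA = 10.0   # what the spec sheet asks facilities to provide
MEASURED_KW = 4.0      # assumed actual draw under typical load
POWER_FACTOR = 0.95    # assumed; converts measured kW back to kVA
DERATE = 0.80          # usable fraction of a circuit's rating

def required_circuit_kva(load_kva, derate=DERATE):
    """Circuit capacity needed so the load stays within the derated limit."""
    return load_kva / derate

nameplate_circuit = required_circuit_kva(NAMEPLATE_KVA)
measured_circuit = required_circuit_kva(MEASURED_KW / POWER_FACTOR)

print(f"Provisioned from the label:   {nameplate_circuit:.1f} kVA of circuit capacity")
print(f"Provisioned from metering:    {measured_circuit:.1f} kVA of circuit capacity")
print(f"Stranded capacity per outlet: {nameplate_circuit - measured_circuit:.1f} kVA")
```

Every kVA of stranded power capacity also drags a matching slice of distribution and cooling capacity along with it, which is the over-engineering effect Neudorfer describes.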

Another issue is that IT and facilities often work along vastly different timelines. Facilities might work along a timeline of 10 years or more, while IT may be changing plans every three years or less. Getting the two groups to understand each other's pace is always a challenge.

Fortunately, there is an incentive to change because upper management is getting tired of some of these battles. Some companies adopt a moderator or mediator to help coordinate the two groups and smooth out some of the planning and communication snafus that occur. That may be an internal person or an outside consultant or advisor.

What other factors do you see driving the need for more communication between IT and facilities?

JN: The need for cooling has been a huge driving factor. The traditional approach to cooling was to consider the average watts per square foot of power use. But the use of blade servers and other high-density rack designs has really required some creative and specialized approaches to cooling (such as in-row cooling), and this has put a new focus on communication and cooperation between IT and facilities groups.

The typical blade server needs about 5 kW of power in normal operation. Vendors might show how four blades in a rack can eliminate many racks of discrete computers. But few facilities can cool 20 kW or more per rack, and racks that dense certainly can't be cooled effectively using conventional techniques. So both groups have to fundamentally reconsider the way they work together. It doesn't happen easily or quickly, and continued communication problems can stop a project, cause embarrassing errors or drag it on far longer than necessary.
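
The cooling side of that 20 kW figure can be worked out with simple unit conversions. In the sketch below, only the roughly 5 kW-per-blade-server and 20 kW-per-rack figures come from the discussion above; the legacy rack density is an illustrative assumption.

```python
# Back-of-the-envelope heat-load math: why a 20 kW rack strains
# conventional room-level cooling.

BTU_PER_HOUR_PER_WATT = 3.412   # nearly all IT power ends up as heat
BTU_PER_HOUR_PER_TON = 12_000   # one ton of cooling removes 12,000 BTU/hr

def cooling_tons(rack_kw):
    """Tons of cooling needed to remove a rack's heat load."""
    return rack_kw * 1_000 * BTU_PER_HOUR_PER_WATT / BTU_PER_HOUR_PER_TON

legacy_rack_kw = 3.0   # assumed density a traditional room-level design expects
blade_rack_kw = 20.0   # four ~5 kW blade servers in one rack, per the figures above

print(f"Legacy rack: {cooling_tons(legacy_rack_kw):.1f} tons of cooling")
print(f"Blade rack:  {cooling_tons(blade_rack_kw):.1f} tons of cooling")
```

Roughly 5.7 tons of heat removal concentrated in a single rack footprint is what pushes facilities toward in-row or other close-coupled cooling rather than simply adding room air handlers.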

Some vendors (especially blade server manufacturers) are becoming more proactive in communicating the system's needs to the customer, and even aiding the customer with thermal mapping tools and appropriately sized cooling systems. But it takes a lot of time and effort to focus on proper communication and ensure that the needs of new equipment are properly met.

What kind of real-life business implications can develop as a consequence of poor communication between IT and facilities?

JN: The whole concept of blade servers and virtualization was touted as a way to save energy and organize the data center in a centralized fashion. It works in principle -- but there's often a big gap between the principle and the execution. One client started a virtualization project with a pilot that consolidated about 25 servers onto one blade server. It was a proof of concept, and they were experimenting with the software. It worked -- they even tested failover behavior -- and after six months they were comfortable enough to decide to move forward. They didn't experience any serious heat problems, though it was a bit warm in the back of the blade cabinets.

So the project was approved, some 30 blade servers came in, and IT started loading up the cabinets. Facilities had provided extra power for the pilot program, but there was no planning for cooling. Once the first six racks or so were powered up, the room got horribly warm around the blade servers and they had to be shut down. The project was put on hold while they backpedaled to bring in more cooling. Providing the power wasn't an issue, but there was just not enough infrastructure for cooling -- not enough chilled water and so on. There really was a lack of communication between IT and facilities in terms of the real requirements of the fully deployed project. It basically put the project a year and several million dollars behind.

How are organizations improving the communication and coordination between facilities and IT?

JN: One problem is that the systems used by each group are different. Even something as simple as temperature is monitored on different systems. Facilities typically uses a BMS (building management system), which employs its own software and communication protocols and isn't open to TCP/IP or an open-systems view. So IT puts thermal monitoring in each rack and may use other temperature-management software with IP-based controls. That monitoring, in turn, doesn't communicate with the BMS, which has sensors talking to the cooling systems and maybe some temperature-monitoring alarms in the building. The point is that they're independent of each other, so a BMS might not even sense overheating in a rack because the average room temperature is acceptable. Even when the groups try to coordinate their monitoring efforts, they may not be able to communicate properly because they're looking at different things. To really start communicating, both facilities and IT need to use the same reporting system and understand the significance of what is being measured.
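
A tiny sketch illustrates the reporting gap Neudorfer describes: a BMS that reports one room-average temperature can look fine while an individual rack inlet is overheating. All of the readings, rack names and the alarm threshold below are hypothetical.

```python
# Hypothetical readings from two independent monitoring systems.

bms_room_average_f = 72.0   # single figure the building management system reports

rack_inlet_f = {            # per-rack readings from IT's IP-based sensors
    "rack-01": 70.5,
    "rack-02": 71.0,
    "rack-07": 93.5,        # hot spot that the room average hides
}

RACK_INLET_LIMIT_F = 80.0   # assumed alarm threshold for rack inlet air

print(f"BMS room average: {bms_room_average_f:.1f} F -- looks acceptable")
for rack, temp in sorted(rack_inlet_f.items()):
    status = "ALARM" if temp > RACK_INLET_LIMIT_F else "ok"
    print(f"{rack}: {temp:.1f} F [{status}]")
```

Getting both sources into one report like this is the easy part; the harder real-world work is bridging the BMS's building protocols (BACnet, Modbus and the like) with IT's IP-based monitoring so that both groups are looking at the same numbers.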

Assigning a permanent liaison between IT and facilities can be helpful, but it really depends on the size of the facilities department. Facilities people are often far more sophisticated if the business is running a dedicated data center. Yet a facilities staff managing a mixed-use building with a 5,000-square-foot data center and a 50,000-square-foot office may not be as sensitive to some of these problems because the emphasis is more often on the building's systems. The overall environment and training level of the facilities staff makes a huge difference here.

How can organizations overcome situations where both sides are just "stuck" and are unable to compromise? How can funding problems be avoided in today's economy?

JN: The trend over the last few years has been that funding is a problem for all types of projects, and it's certainly impacting IT. An analysis of a site may reveal inadequate power or insufficient space -- whether in the physical space or the underlying infrastructure. The cost of a facility renovation is tremendous, and when there's an impasse, organizations are increasingly looking at cloud computing or colocation and managed services. Colocation has become extremely popular due to a lack of capital funding; it's much easier than rebuilding a "live" data center, and it requires less time and talent within the organization (even if the funding is available for a renovation). Colocation carries its own set of issues and considerations, but it's a solution when funding isn't there or facilities can't meet the demands.

What impact do you see server virtualization having in facilities planning? What new problems has virtualization created?

JN: Virtualization has been a tremendous benefit for most data centers. If done right, virtualization can reduce the overall power and cooling footprint of a data center. But the density problems that come with rack and blade servers have become a real bone of contention between IT and facilities groups.

The real issue here is that the initial savings brought by virtualization wind up being overtaken by a new cycle of computing growth. Within a year or two after a virtualization project is implemented, organizations start buying more blade servers because they have the room and need to run more applications. In effect, the "savings" in power and cooling frees the organization to pursue new computing projects. Before long, the data center fills with newer, high-density equipment, and the problem of inadequate power and cooling resurfaces. Virtualization allows the company to meet its original goals successfully, and that's a good thing, but it can lead to further growth that eventually puts the organization back where it started.

What other ways can organizations ease the burden on facilities? How is the cloud impacting facilities problems?

JN: The cloud does have a direct impact on facilities because it offloads some of the resources that would otherwise be needed for the data center. But the cloud is not for every organization or every application. In some cases, data should not be out in the cloud. Still, a lot of smaller organizations look at the cloud as a meaningful solution that can reduce pressure on facilities. The answer is typically a hybrid approach where only some non-critical applications may be relegated to the cloud, while sensitive or mission-critical applications remain in-house. In some cases, the data center may be mirrored to a colocation site to provide a disaster recovery option rather than having to manage the facilities for a second site directly.

While a higher operating temperature is usually an acceptable means of mitigating cooling needs, only some organizations will push that thermal envelope. Traditional organizations still feel more comfortable with the 68-degree "meat locker." Usually it's the Googles and Yahoos of the world that really dare to flirt with 80-degree operating temperatures. It's not being embraced quite as fast as we might like to see.

What else can organizations do to improve the cooperation between facilities and IT? Are things getting better out there?

JN: I do see an overall improvement in cooperation, driven largely by the cost of power. Over the three-year lifespan of a commodity server, powering and cooling that server now costs more than the server itself. The server that cost $2,000 to put on the production floor may cost an additional $5,000-$6,000 just to power and cool. IT has become much more aware of those costs. These enormous costs are forcing organizations to consider new power-efficient servers and cooling technologies (along with virtualization to improve computing hardware utilization) to lower power and cooling costs. The challenge is that we're always finding new applications and services to deploy, and that's keeping power and cooling needs high over time.
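
As a rough illustration of how a $2,000 server can rack up a $5,000-$6,000 power-and-cooling bill, here is a simple three-year estimate. The wattage, PUE, electricity rate and amortized infrastructure share are illustrative assumptions, not figures from the interview.

```python
# Hypothetical three-year power-and-cooling cost for one commodity server.

SERVER_WATTS = 500    # assumed average draw of a loaded commodity server
PUE = 2.0             # assumed facility overhead: cooling, UPS and distribution losses
RATE_PER_KWH = 0.15   # assumed blended electricity rate, in dollars
HOURS_PER_YEAR = 8_760
YEARS = 3

energy_cost = (SERVER_WATTS / 1_000) * PUE * HOURS_PER_YEAR * YEARS * RATE_PER_KWH

# Published per-server figures typically also amortize a share of the power and
# cooling plant itself (UPS, chillers, CRAC units); the share below is assumed.
INFRASTRUCTURE_SHARE = 2_000

print(f"3-year energy cost (IT load x PUE): ${energy_cost:,.0f}")
print(f"3-year total with amortized infrastructure: ${energy_cost + INFRASTRUCTURE_SHARE:,.0f}")
```

With these assumptions the total lands near $5,900, in line with the range quoted above; the exact figure is very sensitive to the server's real draw and the local electricity rate.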

This was last published in November 2010
