Sunday, January 10, 2010

Data Center cooling

Data center cooling is becoming an ever greater challenge as blade servers make it possible to pack more and more computing power into smaller spaces. A standard server cabinet dissipates on the order of 3 kilowatts of power, while a densely packed blade server cabinet can dissipate up to five times as much. Many companies will need to make expensive expansions to their air conditioning systems to accommodate these increases in power density, but simply adding air conditioning capacity is not the most efficient way to address the problem.
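To put these numbers in perspective, the quoted loads can be turned into a rough heat-density figure. The sketch below uses the 3 kilowatt standard-cabinet load and the five-times blade multiplier from the text; the cabinet footprint is an assumed typical value, not a figure from the project.

```python
# Back-of-the-envelope rack heat density using the loads quoted above.
# The cabinet footprint is an assumed typical value (about 0.6 m x 1.07 m).

STANDARD_CABINET_KW = 3.0   # typical server cabinet load (from the article)
BLADE_MULTIPLIER = 5        # blades can dissipate up to 5x as much

blade_cabinet_kw = STANDARD_CABINET_KW * BLADE_MULTIPLIER

footprint_m2 = 0.6 * 1.07   # assumed cabinet footprint (width x depth), m^2

print(f"Blade cabinet load: {blade_cabinet_kw:.0f} kW")
print(f"Heat density: {blade_cabinet_kw / footprint_m2:.1f} kW per square meter")
```

Even this crude estimate shows why cooling capacity alone is not the issue: the heat is concentrated in a few square meters, so where the cold air goes matters as much as how much of it there is.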

Air conditioning must be provided where it is needed, and the ideal cooling arrangement usually involves the careful management of cool and warm air flows. “Enterprises must … carefully factor in the power and cooling demands of these blades,” says a recent Gartner Group report. “Many organizations will not be able to achieve optimum density because of environmental limitations.” ASHRAE’s Thermal Guidelines for Data Processing Environments (2004) recommends that server room temperatures be maintained between 20 and 25 degrees Celsius. As temperatures rise above that band, equipment failure becomes a real possibility, which can lead to downtime and even data loss.

The challenge was especially great in a recent project where the computer room air conditioner (CRAC) units were located in mechanical rooms on either side of the server area. This approach makes it possible to keep the server area locked and sealed even while the CRAC units are being serviced. The room was originally designed so that cold air would flow out of the CRAC units, under the raised floor, and up through perforated tiles into the server area, forming cold aisles between racks of servers. The cold air would then be drawn into the air intakes of the server racks, where it would take on heat from the equipment. The hot air would then exit from the backs of the servers into hot aisles, where it would be drawn up to the ceiling and through perforated panels for return to the CRACs.

One approach to estimating cooling requirements for data centers involves hand calculations to estimate the performance of various alternatives. The problem with these calculations is that they require many simplifying assumptions, limiting their accuracy in practical applications. Another approach is the use of energy analysis programs that accept as input a description of the building layout, systems, construction, usage, and utility rates, along with weather data. The limitation of this type of program is that it is primarily intended to evaluate energy usage and cost, and thus can only predict average temperatures for a particular space. Of course, knowing that the average temperature in the data center is 22 degrees Celsius is small comfort if the air temperature near one critical server is 29 degrees Celsius.
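A minimal sketch of the kind of hand calculation mentioned above is the sensible-heat balance Q = ρ·V·cp·ΔT, solved for the airflow V needed to carry away a given rack load. The air density and the temperature rise across the rack are assumed values here, which is exactly the sort of simplifying assumption that limits these calculations in practice.

```python
# Sensible-heat hand calculation: airflow needed to absorb a rack's load.
# Air properties and the air temperature rise across the rack are assumed.

RHO_AIR = 1.2     # kg/m^3, air density at room conditions (assumed)
CP_AIR = 1005.0   # J/(kg*K), specific heat of air at constant pressure

def required_airflow_m3s(load_kw: float, delta_t_k: float) -> float:
    """Airflow (m^3/s) needed to absorb load_kw with a delta_t_k temperature rise."""
    return load_kw * 1000.0 / (RHO_AIR * CP_AIR * delta_t_k)

for load in (3.0, 15.0):  # standard cabinet vs. blade cabinet, from the text
    v = required_airflow_m3s(load, delta_t_k=11.0)  # assumed 11 K rise
    print(f"{load:4.1f} kW rack -> {v:.2f} m^3/s ({v * 2118.9:.0f} CFM)")
```

Note what the calculation cannot tell you: it yields a bulk airflow number, but says nothing about whether that air actually reaches the rack intakes, which is where CFD comes in.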

In the past, about all that engineers could do was make sure the average temperatures were right and hope that there wasn’t too much variation across the data center. In this building, because of the unique design, there was a real risk that, once construction was finished, temperatures in the data center would be too high. It would then be necessary to go through a lengthy and expensive trial-and-error process to remedy the problem.

Facilities Engineering Associates (FEA) felt that computer simulation was essential to resolving the unique cooling problems in this application. Computational fluid dynamics (CFD) can provide enormous assistance by calculating and graphically illustrating the complete airflow patterns, including velocities and distributions of variables such as pressure and temperature. FEA selected Fluent Incorporated, Lebanon, NH, to perform the analysis because of Fluent’s leading position in the CFD industry and its experience with heating, ventilation, and air conditioning applications. The goal was to identify potential problems in the room prior to construction, preventing them from occurring later, when remedies would require costly downtime for after-the-fact modifications. The process of simulating airflow in a data center has been greatly simplified by the development of Airpak CFD software from Fluent Inc., which is designed specifically for modeling internal building flows.

Pathlines colored by temperature for the perpendicular rack case, simulated as part of the proof-of-concept phase of the project

Pathlines colored by temperature for the parallel rack proof-of-concept case

Simulating the proof-of-concept designs

The Fluent consultant assigned to the project began by modeling the geometry of the two proposed proof-of-concept designs. The main goal of this analysis was to evaluate the placement of the high density rack relative to the position of the CRAC units in the data center room. In one proposed layout, the CRAC units are parallel to the server racks and in the other design, they are perpendicular. The server and mechanical rooms were defined as boxes, and the raised floor, physical rack locations, CRAC units, suspended ceilings, beams, and tiles were created for each case. One server rack was assigned a higher power density than the others, and the impact of its presence and location in the room for each case was studied. The simulation results provided the air velocities, pressures, and temperatures throughout the server room and lower plenum, and were used to illustrate the resulting air flow patterns for each layout. The global air flow patterns showed that the design worked as intended, with the air rising through the perforated sections of the floor, flowing through the racks, and exiting through the returns in the ceiling.

Zoomed view of pathlines in the vicinity of high-density rack for the perpendicular proof-of-concept case

While the design seemed to work well at first glance, closer examination revealed several serious problems. The figure above shows the flow in the area of the high-density rack, which is positioned next to the wall in the perpendicular orientation case. Cooling air is drawn into the cold aisle between this rack and the adjacent low-density rack through the perforated floor tiles. Following the temperature-coded pathlines, it can be seen that the hot air exiting the high-density rack circles over the top of the rack and re-enters the cold aisle, where it is drawn into the upper intake portion of the neighboring low-density rack. The lower portion of the low-density rack is not affected; it continues to be properly cooled by air entering through the floor tiles. Temperature contours on the rack intakes (see figure below) further illustrate the problem. A similar problem occurs in the parallel orientation case, where the high-density rack is positioned between two normal-density racks: hot air exiting the high-density rack circles above and around the sides of the rack and re-enters the cold aisle, compromising the cooling air delivered through the floor.

Surface temperatures on the rack inlets in the cold aisles for the perpendicular proof-of-concept case

Evening out the heat load

The proof-of-concept simulations demonstrated that the root cause of the problem lay in the layout of the high-density racks. The original design concentrated the high-load racks in a single row, which created a hot spot in that area. Working within the constraints provided by the client, engineers repositioned the servers to even out the heat load throughout the room. The engineers also addressed the fact that the heat load of the different racks varies widely, from 1 to 7 kilowatts. The original model used the power ratings provided by the blade manufacturers, but the engineers recognized that deviations from the rated values could greatly affect the accuracy of the simulations. They therefore measured the power draw of each unit and discovered that the actual values were considerably lower than the rated ones. Using this information, along with results from the simulations, the engineers determined the air conditioning capacity required to cool the room. The calculation was based on the requirement that the failure of any single CRAC unit must not cause temperatures to rise to dangerous levels. Finally, they examined the predicted pressure losses throughout the room and lower plenum to determine the requirements of the fans used to drive the system.
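The single-CRAC-failure requirement described above amounts to N+1 redundancy: pick enough units of a given capacity that the room load is still covered when any one unit fails. The sketch below illustrates that sizing logic; the measured rack loads and the per-unit capacity are illustrative assumptions, not figures from the project.

```python
# N+1 CRAC sizing sketch: losing any single unit must still cover the load.
# Rack loads and unit capacity below are assumed, illustrative values.

import math

def crac_units_needed(rack_loads_kw, unit_capacity_kw):
    """Smallest unit count such that (count - 1) units still cover the total load."""
    total_kw = sum(rack_loads_kw)
    return math.ceil(total_kw / unit_capacity_kw) + 1

measured_loads = [1.0, 2.5, 4.0, 7.0, 3.5, 2.0]  # kW per rack (assumed measurements)
print(crac_units_needed(measured_loads, unit_capacity_kw=10.0))
```

Using measured rather than nameplate loads matters here: sizing against inflated ratings would buy more CRAC capacity than the redundancy requirement actually demands.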

FEA and Fluent engineers also evaluated the effect of the plenum height on the flow patterns in this region. The results showed that a jet of air traveled through the 10.5-inch-high plenum and struck the opposing wall, causing a recirculation pattern with nonuniform flow conditions and localized high-pressure zones. Suspecting that the height of the plenum was the cause of the problem, the engineers generated several models with plenums of various heights. They discovered that a plenum height of approximately 16.5 inches was the smallest that would promote smooth air flow into the server room. To straighten out the airflow through the perforated floor tiles, the engineers specified ductwork with an aerodynamic shape that creates a uniform static pressure across the entire space and also absorbs noise.
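The intuition behind the plenum-height study can be shown with a one-line area calculation: for a fixed supply airflow, a shallower plenum forces a higher mean velocity, which is what produces the jet and recirculation described above. The supply airflow, plenum width, and velocity limit below are assumed values; the actual study used full CFD, not this kind of bulk estimate.

```python
# Rough plenum mean-velocity check versus plenum height.
# Supply airflow, plenum width, and the velocity limit are assumed values;
# the real study resolved the jet and recirculation with CFD.

def plenum_velocity_ms(flow_m3s: float, height_in: float, width_m: float) -> float:
    """Mean velocity (m/s) through a rectangular plenum cross-section."""
    height_m = height_in * 0.0254  # inches to meters
    return flow_m3s / (height_m * width_m)

TARGET_MAX_MS = 2.5  # assumed limit above which jetting is likely

for h in (10.5, 16.5):  # the two plenum heights discussed in the text
    v = plenum_velocity_ms(flow_m3s=5.0, height_in=h, width_m=6.0)
    flag = "OK" if v <= TARGET_MAX_MS else "too fast"
    print(f"{h:4.1f} in plenum -> {v:.2f} m/s ({flag})")
```

Under these assumed numbers the 10.5-inch plenum exceeds the velocity limit while the 16.5-inch plenum stays under it, consistent with the behavior the simulations revealed.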

This application demonstrates that CFD can be used to resolve problematic areas within existing or planned data centers. By using CFD in the design process, potential cooling problems, such as those illustrated in this example, can be identified before they occur, saving the time and expense required for repairs and retrofitting, once the center is on-line.




Underground Secure Data Center Operations

Technology-based companies are building new data centers in old mines, caves, and bunkers to host computer equipment below the Earth's surface.

Underground secure data center operations are on an upward trend.

Operations have been launched in inactive gypsum mines, caves, abandoned coal mines, abandoned solid limestone mines positioned deep below the bedrock, and decommissioned nuclear bunkers deep underground, secure from disasters both natural and man-made.

These facilities have advantages over traditional data centers, such as increased security, lower cost, scalability, and ideal environmental conditions. Their economic model works, despite the proliferation of data center providers, thanks largely to the natural qualities inherent in underground data centers.

With anywhere from 10,000 to over 1,000,000 square feet available, there is ample space to be subdivided to accommodate the growth needs of clients. In addition, underground data centers have an unlimited supply of naturally cool, 50-degree air, providing the ideal temperature and humidity for computer equipment with minimal HVAC cost.

Their operators bill them as the most secure data centers in the world, unparalleled in terms of square footage, scalability, and environmental control.

Yet, while the physical and cost benefits of being underground make them attractive, operators have also had to invest heavily in high-speed connectivity and redundant power and fiber systems to ensure their operations are not just secure, but also state-of-the-art.

They initially focused on providing disaster recovery solutions and backup co-location services.

Some clients lease space for their own servers, while the operator provides secure facilities, power, and bandwidth. Operators offer redundant power sources and multiple high-speed Internet connections through OC circuits connected to a SONET ring, linked to outside connectivity providers through redundant fiber cables.

Underground data center companies augment their core services to include disaster recovery solutions, call centers, network operations centers (NOCs), wireless connectivity, and more.

Strategic partnerships with international and national information technology companies enable them to offer technology solutions ranging from system design and implementation to the sale of software and equipment.

The natural qualities of underground data centers allow them to offer the best of both worlds: premier services and security at highly competitive rates.

Underground data centers were first established in the 1990s but really came into their own after the September 11 attacks in 2001, when their founders realized that former mines and bunkers offered optimal conditions for a data center: superior environmental conditions for electronic equipment, nearly invulnerable security, and locations near power grids.

Adam Couture, a Massachusetts-based analyst for Gartner Inc., said underground data centers could find a niche serving businesses that want to reduce their vulnerability to future attacks. Some underground data center fact sheets claim that the facilities would protect a data center even from a cruise missile explosion or plane crash.

After the September 11 attacks in 2001, companies everywhere went back and re-evaluated their business-continuity plans. That does not mean everybody changed them, but everybody revisited them in the wake of what happened, and the underground data center may be just the answer some of them are looking for.

Comparison chart: Underground data centers

Five facilities compared
InfoBunker, LLC
- Location: Des Moines, Iowa*
- In business since: 2006
- Security/access control: Biometric; keypad; pan, tilt and zoom cameras; door event and camera logging
- Distance underground: 50 feet
- Ceiling height in data center space: 16 feet
- Original use: Military communications bunker
- Total data center space: 34,000 square feet
- Total space in facility: 65,000 square feet
- Data center clients include: Insurance company, telephone company, teaching hospital, financial services, e-commerce, security monitoring/surveillance, veterinary, county government
- Number of hosted primary or backup data centers: 2
- Services offered: Leased data center space, disaster recovery space, wholesale bandwidth
- Distance from nearest large city: Des Moines, about 45 miles*
- Location of cooling system, including cooling towers: Underground
- Location of generators and fuel tanks: Underground

The Bunker
- Location: Dover, UK
- In business since: 1999
- Security/access control: CCTV, dogs, guards, fence
- Distance underground: 100 feet
- Ceiling height in data center space: 12 to 50 feet
- Original use: Royal Air Force military bunker
- Total data center space: 50,000 square feet
- Total space in facility: 60,000 square feet
- Data center clients include: Banking, mission-critical Web applications, online trading
- Number of hosted primary or backup data centers: 50+
- Services offered: Fully managed platforms, partly managed platforms, co-location
- Distance from nearest large city: Canterbury, 10 miles; London, 60 miles
- Location of cooling system, including cooling towers: Underground
- Location of generators and fuel tanks: Above ground and below ground

Montgomery Westland
- Location: Montgomery, Tex.
- In business since: 2007
- Security/access control: Gated, with access control card, biometrics and a 24x7 security guard
- Distance underground: 60 feet
- Ceiling height in data center space: 10 feet
- Original use: Private bunker designed to survive a nuclear attack. The complex was built in 1982 by Louis Kung (nephew of Madame Chiang Kai-shek) as a residence and headquarters for his oil company, including a secret, 40,000-square-foot nuclear fallout shelter. The office building uses bulletproof glass on the first floor and reception area and 3-inch concrete walls with fold-down steel gun ports to protect the bunker 60 feet below.
- Total data center space: 28,000 square feet, plus 90,000 square feet of office space in a hardened, above-ground building
- Total space in facility: 28,000 square feet
- Data center clients include: NASA/T-Systems, Aker Solutions, Continental Airlines, Houston Chronicle, Express Jet
- Number of hosted primary or backup data centers: 13
- Services offered: Disaster recovery/business continuity, co-location and managed services
- Distance from nearest large city: Houston, 40 miles
- Location of cooling system, including cooling towers: Above and below ground; all cooling towers above ground in a secure facility
- Location of generators and fuel tanks: Two below ground, four above ground; all fuel tanks buried topside

Cavern Technologies
- Location: Lenexa, Kan.
- In business since: 2007
- Security/access control: Security guard, biometric scan, smart card access and motion detection alarms
- Distance underground: 125 feet
- Ceiling height in data center space: 16 to 18 feet
- Original use: Limestone mine originally developed by an asphalt company that used the materials in road pavement
- Total data center space: 40,000 square feet
- Total space in facility: 3 million square feet
- Data center clients include: Healthcare, insurance, universities, technology, manufacturing, professional services
- Number of hosted primary or backup data centers: 26
- Services offered: Data center space leasing, design, construction and management
- Distance from nearest large city: Kansas City, 15 miles
- Location of cooling system, including cooling towers: Air-cooled systems located underground; cooling towers located outside
- Location of generators and fuel tanks: Underground

Iron Mountain The Underground
- Location: Butler County, Penn.*
- In business since: Opened by National Storage in 1954; acquired by Iron Mountain in 1998
- Security/access control: 24-hour armed guards, visitor escorts, magnetometer, x-ray scanner, closed-circuit television, badge access and other physical and electronic measures for securing the mine's perimeter and vaults
- Distance underground: 220 feet
- Ceiling height in data center space: 15 feet (10 feet from raised floor to dropped ceiling)
- Original use: Limestone mine
- Total data center space: 60,000 square feet
- Total space in facility: 145 acres developed; 1,000 acres total
- Data center clients include: Marriott International Inc., Iron Mountain, three U.S. government agencies
- Number of hosted primary or backup data centers: 5
- Services offered: Data center leasing, design, construction and maintenance services
- Distance from nearest large city: Pittsburgh, 55 miles
- Location of cooling system, including cooling towers: Chillers located above ground to take advantage of "free cooling"; pumps located underground
- Location of generators and fuel tanks: Underground

*Declined to cite exact location/distance for security reasons.