Data Center Sites: How Deep Can You Go?
Efforts to retrofit subterranean bunkers into functional data center space have been under way for years. But as power requirements and security considerations have intensified, a new trend has emerged: underground sites designed from day one to house mission-critical infrastructure. Demand for ultra-secure underground data center space is being fueled by projected energy, security and regulatory benefits relative to above-ground alternatives.
Computerworld – Industrial Light & Magic has been replacing its servers with the hottest new IBM BladeCenters — literally, the hottest.
For every new rack ILM brings in, it cuts overall power use in the data center by 140 kW, an 84% drop in overall energy use.
But power density in the new racks is much higher: Each consumes 28 kW of electricity, versus 24 kW for the previous generation. Every watt of power consumed is transformed into heat that must be removed from each rack — and from the data center.
The new racks are equipped with 84 server blades, each with two quad-core processors and 32GB of RAM. They are powerful enough to displace seven racks of older BladeCenter servers that the special effects company purchased about three years ago for its image-processing farm.
To cool each 42U rack, ILM’s air conditioning system must remove more heat than would be produced by nine household ovens running at the highest temperature setting. This is the power density of the new infrastructure that ILM is slowly building out across its raised floor.
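The arithmetic behind these figures can be checked directly. A quick sketch (the ~3 kW draw assumed for a household oven at full power is an estimate, not a figure from the article):

```python
# Back-of-the-envelope check of the power figures cited for ILM's refresh.

OLD_RACK_KW = 24      # previous-generation BladeCenter rack
NEW_RACK_KW = 28      # new BladeCenter rack
RACKS_REPLACED = 7    # old racks retired per new rack installed
OVEN_KW = 3.0         # assumption: household oven at its highest setting

old_total_kw = OLD_RACK_KW * RACKS_REPLACED      # 168 kW before the refresh
savings_kw = old_total_kw - NEW_RACK_KW          # 140 kW saved per new rack
savings_pct = savings_kw / old_total_kw * 100    # ~83%; the article rounds to 84%

ovens_equivalent = NEW_RACK_KW / OVEN_KW         # roughly nine ovens' worth of heat

print(f"Power saved per refresh: {savings_kw} kW ({savings_pct:.1f}%)")
print(f"Heat per new rack: about {ovens_equivalent:.0f} household ovens")
```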
These days, most new data centers are designed to support an average density of 100 to 200 watts per square foot, and the typical cabinet draws about 4 kW, says Peter Gross, vice president and general manager of HP Critical Facilities Services. A data center designed for 200 W per square foot can support an average rack density of about 5 kW. With carefully engineered airflow optimizations, a room air conditioning system can support some racks at up to 25 kW, he says.
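The relationship between floor-loading density and per-rack power implied by Gross's figures can be sketched as follows; the ~25 square feet allotted per cabinet (rack plus its share of aisle space) is derived from his numbers, not stated in the article:

```python
# Translate design density (watts per square foot of floor) into the
# average rack power that density supports, assuming each cabinet is
# allotted roughly 25 sq ft of floor space including aisles.

def avg_rack_kw(watts_per_sqft: float, sqft_per_rack: float = 25.0) -> float:
    """Average per-rack power a given design density can support."""
    return watts_per_sqft * sqft_per_rack / 1000.0

print(avg_rack_kw(100))  # 2.5 kW per rack
print(avg_rack_kw(200))  # 5.0 kW per rack, matching the figure Gross cites
```

At ILM's 28 kW per rack, the same formula implies a design density of over 1,100 W per square foot if racks were packed at that spacing, which is why such racks must be spread out or cooled with specialized systems.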
At 28 kW per rack, ILM is at the upper limit of what can be cooled with today’s computer room air conditioning systems, says Roger Schmidt, IBM fellow and chief engineer for data center efficiency. “You’re hitting the extreme at 30 kW. It would be a struggle to go a whole lot further,” he says.
The sustainability question
The question is, what happens next? “In the future are watts going up so high that clients can’t put that box anywhere in their data centers and cope with the power and cooling? We’re wrestling with that now,” Schmidt says. The future of high-density computing beyond 30 kW will have to rely on water-based cooling, he says. But data center economics may make it cheaper for many organizations to spread out servers rather than concentrate them in racks with ever-higher energy densities, other experts say.
Energy-efficiency tips
Refresh your servers. Each new generation of servers delivers more processing power per square foot — and per unit of power consumed. For every new BladeCenter rack Industrial Light & Magic is installing, it has been able to retire seven racks of older blade technology. Total power savings: 140 kW.
Charge users for power, not just space. “You can be more efficient if you’re getting a power consumption model along with square-footage cost,” says Ian Patterson, CIO at Scottrade.
Use hot aisle/cold aisle designs. Good designs, including careful placement of perforated tiles to focus airflows, can help data centers keep cabinets cooler and turn the thermostat up.
Kevin Clark, director of information technologies at ILM, likes the gains in processing power and energy efficiency he has achieved with the new BladeCenters, which have followed industry trends to deliver more bang for the buck. According to IDC, the average server price since 2004 has dropped 18%, while the cost per core has dropped by 70%, to $715. But Clark wonders whether doubling compute density again, as he has in the past, is sustainable. “If you double the density on our current infrastructure, from a cooling perspective, it’s going to be difficult to manage,” he says.
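The IDC figures Clark cites imply a 2004 baseline cost per core; a quick check (the 2004 number below is derived from the stated 70% drop, not given in the article):

```python
# Derive the implied 2004 cost per core from the IDC figures cited:
# a 70% drop brought the cost per core down to $715.

current_cost_per_core = 715.0
fractional_drop = 0.70

implied_2004_cost = current_cost_per_core / (1 - fractional_drop)
print(f"Implied 2004 cost per core: ${implied_2004_cost:,.0f}")
```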
Name | InfoBunker, LLC | The Bunker | Montgomery Westland | Cavern Technologies | Iron Mountain The Underground |
---|---|---|---|---|---|
Location | Des Moines, Iowa* | Dover, UK | Montgomery, Tex. | Lenexa, Kan. | Butler County, Penn.* |
In business since | 2006 | 1999 | 2007 | 2007 | Opened by National Storage in 1954. Acquired by Iron Mountain 1998. |
Security /access control | Biometric; keypad; pan, tilt and zoom cameras; door event and camera logging | CCTV, dogs, guards, fence | Gated, with access control card, biometrics and a 24x7 security guard | Security guard, biometric scan, smart card access and motion detection alarms | 24-hour armed guards, visitor escorts, magnetometer, x-ray scanner, closed-circuit television, badge access and other physical and electronic measures for securing the mine's perimeter and vaults |
Distance underground (feet) | 50 | 100 | 60 | 125 | 220 |
Ceiling height in data center space (feet) | 16 | 12 to 50 | 10 | 16 to 18 | 15 (10 feet from raised floor to dropped ceiling) |
Original use | Military communications bunker | Royal Air Force military bunker | Private bunker designed to survive a nuclear attack. Complex built in 1982 by Louis Kung (nephew of Madame Chiang Kai-shek) as a residence and headquarters for his oil company, including a secret, 40,000-square-foot nuclear fallout shelter. The office building uses bulletproof glass on the first floor and reception area and 3-inch concrete walls with fold-down steel gun ports to protect the bunker 60 feet below. | Limestone mine originally developed by an asphalt company that used the materials in road pavement | Limestone mine |
Total data center space (square feet) | 34,000 | 50,000 | 28,000 plus 90,000 of office space in a hardened, above-ground building. | 40,000 | 60,000 |
Total space in facility | 65,000 | 60,000 | 28,000 | 3 million | 145 acres developed; 1,000 acres total |
Data center clients include | Insurance company, telephone company, teaching hospital, financial services, e-commerce, security monitoring/surveillance, veterinary, county government | Banking, mission critical Web applications, online trading | NASA/T-Systems, Aker Solutions, Continental Airlines, Houston Chronicle, Express Jet | Healthcare, insurance, universities, technology, manufacturing, professional services | Marriott International Inc., Iron Mountain, three U.S. government agencies |
Number of hosted primary or backup data centers | 2 | 50+ | 13 | 26 | 5 |
Services offered | Leased data center space, disaster recovery space, wholesale bandwidth | Fully managed platforms, partly managed platforms, co-location | Disaster recovery/business continuity, co-location and managed services | Data center space leasing, design, construction and management | Data center leasing, design, construction and maintenance services |
Distance from nearest large city | Des Moines, about 45 miles* | Canterbury, 10 miles; London, 60 miles | Houston, 40 miles | Kansas City, 15 miles | Pittsburgh, 55 miles |
Location of cooling system, including cooling towers | Underground | Underground | Above and below ground. All cooling towers above ground in secure facility. | Air-cooled systems located underground. Cooling towers located outside | Chillers located above ground to take advantage of "free cooling." Pumps located underground. |
Location of generators and fuel tanks | Underground | Above ground and below ground | Two below ground, four above ground. All fuel tanks buried topside. | Underground | Underground |