Sunday, January 10, 2010

Data-center power glossary

The EPA energy efficiency report

Servers and data centers make up a significant portion of the nation’s energy consumers, and a substantial share of this consumption is due to federal data centers. Therefore, the government is taking steps to analyze and better manage the electricity use of its data centers. In 2007, the Environmental Protection Agency (EPA) published a study examining the energy use of data centers in the United States (EPA Report 109-431). It found that:

    Server rooms/data centers consumed about 1.5% of all electricity (or about 61 billion kWh) in the U.S. in 2006, at a cost of around $4.5 billion.

    Federal server rooms/data centers accounted for 10% (or 6 billion kWh) of this electricity at a cost of $450 million annually.

    Volume servers consumed 68% of the electricity used for IT equipment in data centers in 2006.

    The amount of energy these servers use more than doubled between 2000 and 2006.

    Enterprise-class data centers accounted for about 38% of this electricity use and are the fastest-growing segment.
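
As a quick sanity check, the report's headline figures are internally consistent; a few lines of Python, using only the numbers quoted above, recover the implied electricity price and the federal share:

```python
# Figures quoted from EPA Report 109-431 (2006 data).
total_kwh = 61e9        # U.S. data-center electricity use, kWh
total_cost = 4.5e9      # national cost, dollars
federal_share = 0.10    # federal fraction of that consumption

implied_price = total_cost / total_kwh      # dollars per kWh
federal_kwh = federal_share * total_kwh     # kWh
federal_cost = federal_share * total_cost   # dollars

print(round(implied_price, 3))        # 0.074 -> about 7.4 cents per kWh
print(round(federal_kwh / 1e9, 1))    # 6.1 (the report rounds to 6 billion kWh)
print(round(federal_cost / 1e6))      # 450 -> $450 million annually
```

The roughly 7.4 cents/kWh implied price is in line with average U.S. commercial electricity rates of that period, which is a useful cross-check on the report's cost estimate.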

The EPA’s analysis shows that the energy costs in data centers arise not only from energy-hungry servers, but also from the cooling infrastructure necessary to support IT equipment.

Illustration 1: Data Center Power Draws (pie chart)

Data centers today are forced to redefine capacity in terms of power and cooling rather than floor space. While efficient blade servers solve issues of space, they raise issues of cooling by creating intense hot spots, and even the most power-efficient server can strain the electricity bill through increased cooling demands (see also Energy Efficient Equipment in the Data Center in this issue).

Compared to 2000, the overall use of energy by U.S. servers and data centers more than doubled in 2006. If efficiency standards remain unchanged, the EPA report projects that national energy use by data centers could double again in the next five years. Combined with climbing energy prices, this means that energy needs will claim ever larger shares of data centers’ budgets, leaving less money for other areas such as expansion, new equipment, or general improvements.
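
For scale, "doubling in the next five years" implies roughly 15% compound annual growth in energy use. The five-year horizon is the report's; the arithmetic below is ours:

```python
# Annual growth rate implied by energy use doubling over a five-year span.
years = 5
annual_growth = 2 ** (1 / years) - 1
print(round(annual_growth * 100, 1))   # 14.9 -> about 15% per year
```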

Energy efficiency is a budget mandate as much as it is an environmental issue.

If you thought IT acronyms were hard to remember, wait until you sit down with your facilities team to discuss your data center's electric bill. You need to learn a whole new vocabulary when you start talking about lowering the building's energy use.

Here's a crib sheet of a dozen of the most commonly used energy terms and acronyms so you can learn the jargon for going green.

1. AC/DC

Yes, this is the name of Australia's greatest rock band, but it's also a key trend in data-center design. AC stands for alternating current, and DC stands for direct current. Leading-edge data-center designers are looking at power supplies based on DC power -- rather than today's AC power -- because DC power promises to be more energy efficient.

2. Carbon footprint

No relation to Sasquatch, although to corporate executives it can be an equally large and scary beast. A company's carbon footprint is the amount of CO2 emissions its operations produce. In setting goals to reduce their carbon footprints, many companies target their data centers, which can account for 25% or more of the corporate electric bill.

3. CFD

It sounds like the acronym for the Chicago Fire Department, but this version stands for computational fluid dynamics. CFD high-performance-computing modeling has been used for a long time in the design of airplanes and weapon systems. Now it's being applied to air flow in data centers for optimal air-conditioning design.

4. Chiller

This isn't what you drink at the beach on a hot day. Rather, it's a machine that uses chilled water to cool and dehumidify air in a data center. Of all the components of a data center's air conditioning system, this is the one that consumes the most electricity -- as much as 33% of a data center's power.

5. Close-coupled cooling

This sounds like a technique that would come in handy on Valentine's Day. In fact, it's a type of data-center air-conditioning system that brings the cooling source as close as possible to the high-density computing systems that generate the most heat. Instead of cooling down the entire room, close-coupled cooling systems located in a rack cool the hot air generated by the servers in just that rack.


6. CRAC

This is not what you sometimes see when a plumber bends over, although it's pronounced the same way. We're talking about a computer-room air-conditioning system. CRAC units monitor a data center's temperature, humidity and air flow. They consume around 10% of a data center's power.

7. DCiE

This acronym has nothing to do with the nation's capital, although its pronunciation is similar. DCiE is the Data Center Infrastructure Efficiency metric (also called DCE, for Data Center Efficiency). DCiE is one of two reciprocal metrics embraced by The Green Grid industry consortium; the other is Power Usage Effectiveness (PUE, below). (See "Two ways to measure power consumption.") DCiE shows the power used by a data center's IT equipment as a percentage of the total power going into the data center. A DCiE of 50% means that 50% of the total power used by a data center goes to the IT equipment, and the other 50% goes to power and cooling overhead. The larger the DCiE, the better.

8. kWh

Electricity is sold in units called kilowatt-hours; 1 kWh is the amount of energy delivered in one hour at a power level of 1,000 watts. This abbreviation for "kilowatt hour" is mostly used in writing rather than conversation.
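
The arithmetic is simply kilowatts multiplied by hours. A small sketch, using a hypothetical 500 W server and an assumed $0.10/kWh rate (neither figure comes from the article):

```python
def kwh(watts, hours):
    """Energy in kilowatt-hours for a load drawing `watts` for `hours`."""
    return watts / 1000 * hours

# Hypothetical 500 W server running around the clock for a 30-day month:
monthly_energy = kwh(500, 24 * 30)        # 360.0 kWh
monthly_cost = monthly_energy * 0.10      # at an assumed $0.10/kWh
print(monthly_energy, round(monthly_cost, 2))   # 360.0 36.0
```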

9. PDU

The acronym PDU stands for power distribution unit, a device that distributes electric power. PDUs function as power strips for a data center and consume around 5% of the power in a typical center.

10. PUE

Not pronounced like the reaction to a bad odor, but one letter at a time. Power Usage Effectiveness is one of two reciprocal metrics embraced by The Green Grid industry consortium; the other is Data Center Infrastructure Efficiency (DCiE, above). PUE is the ratio of the total power going into a data center to the power used by the center's IT equipment. For example, a PUE of 2 means that half of the power used by the data center is going to the IT equipment and the other half is going to the center's power and cooling infrastructure. Experts recommend a PUE of less than 2. The closer a PUE is to 1, the better.
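
Both metrics come from the same two meter readings, so each is the reciprocal of the other. A minimal sketch; the 1,000 kW and 500 kW readings are made-up values chosen to match the PUE-of-2 example above:

```python
def pue(total_kw, it_kw):
    """Power Usage Effectiveness: total facility power over IT power."""
    return total_kw / it_kw

def dcie(total_kw, it_kw):
    """Data Center Infrastructure Efficiency: IT power over total power."""
    return it_kw / total_kw

# 1,000 kW into the facility, 500 kW of it reaching the IT equipment:
print(pue(1000, 500))    # 2.0 -> half the power feeds IT gear
print(dcie(1000, 500))   # 0.5 -> a DCiE of 50%; PUE and DCiE multiply to 1
```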

11. RECs

Pronounced like the short version of the word recreation, this acronym means renewable energy certificates or renewable energy credits. RECs are tradable commodities that show that 1 megawatt-hour of electricity was purchased from a renewable source, such as solar, wind, biomass or geothermal. An increasing number of companies are buying RECs to offset the amount of electricity generated from fossil fuels that their data centers consume.

12. UPS

We're not talking about the boys in brown, although the acronym is pronounced the same way. We're talking about uninterruptible power supply, which provides battery backup if a data center's power fails. It's essential that UPS equipment be energy efficient, because it consumes as much as 18% of the power in a typical data center.


Underground Secure Data Center Operations

Technology-based companies are building new data centers in old mines, caves, and bunkers to host computer equipment below the Earth's surface.

Underground secure data-center operations are on an upward trend.

Operations have launched in inactive gypsum mines, caves, abandoned coal and limestone mines positioned deep below the bedrock, and decommissioned nuclear bunkers, all deep underground and secure from disasters both natural and man-made.

These facilities have advantages over traditional data centers, such as increased security, lower cost, scalability and ideal environmental conditions. Their economic model works, despite the proliferation of data-center providers, thanks largely to the natural qualities inherent in underground data centers.

With anywhere from 10,000 to over 1,000,000 square feet available, there is ample space to subdivide to accommodate the growth needs of clients. In addition, underground data centers have an unlimited supply of naturally cool, 50-degree air, providing the ideal temperature and humidity for computer equipment with minimal HVAC cost.

They are among the most secure data centers in the world and unparalleled in terms of square footage, scalability and environmental control.

Yet, while the physical and cost benefits of being underground make them attractive, these operators have also invested heavily in high-speed connectivity and redundant power and fiber systems to ensure their operations are not just secure, but also state-of-the-art.

These operators initially focused on providing disaster-recovery solutions and backup co-location services.

Some clients lease space for their own servers while the operator provides secure facilities, power and bandwidth. Operators offer redundant power sources and multiple high-speed Internet connections through OC-level circuits on a SONET ring, linked to outside connectivity providers through redundant fiber cables.

Underground data-center companies augment their core services to include disaster-recovery solutions, call centers, NOCs, wireless connectivity and more.

Strategic partnering with national and international information-technology companies enables them to offer technology solutions ranging from system design and implementation to the sale of software and equipment.

The natural qualities of underground data centers allow them to offer the best of both worlds: premier services and security at highly competitive rates.

Underground data centers were established starting in the 1990s but really came into their own after the September 11 attacks in 2001, when their founders realized that former mines and bunkers offered optimal conditions for a data center: superior environmental conditions for electronic equipment, nearly invulnerable security and proximity to power grids.

Adam Couture, a Massachusetts-based analyst for Gartner Inc., said underground data centers could find a niche serving businesses that want to reduce their vulnerability to any future attacks. Fact sheets from some underground data centers claim the facilities would protect equipment even from a cruise-missile explosion or plane crash.

Since the September 11 attacks in 2001, companies everywhere have gone back and re-evaluated their business-continuity plans. That doesn't mean everybody is changing them, but everybody is revisiting them in the wake of what happened, and the underground data center may be exactly what some of them are looking for.

Comparison chart: Underground data centers

Five facilities compared
Name | InfoBunker, LLC | The Bunker | Montgomery Westland | Cavern Technologies | Iron Mountain The Underground
Location | Des Moines, Iowa* | Dover, UK | Montgomery, Tex. | Lenexa, Kan. | Butler County, Penn.*
In business since | 2006 | 1999 | 2007 | 2007 | Opened by National Storage in 1954; acquired by Iron Mountain 1998
Security/access control | Biometric; keypad; pan, tilt and zoom cameras; door event and camera logging | CCTV, dogs, guards, fence | Gated, with access control card, biometrics and a 24x7 security guard | Security guard, biometric scan, smart card access and motion detection alarms | 24-hour armed guards, visitor escorts, magnetometer, x-ray scanner, closed-circuit television, badge access and other physical and electronic measures for securing the mine's perimeter and vaults
Distance underground (feet) | 50 | 100 | 60 | 125 | 220
Ceiling height in data center space (feet) | 16 | 12 to 50 | 10 | 16 to 18 | 15 (10 feet from raised floor to dropped ceiling)
Original use | Military communications bunker | Royal Air Force military bunker | Private bunker designed to survive a nuclear attack. Complex built in 1982 by Louis Kung (nephew of Madame Chiang Kai-shek) as a residence and headquarters for his oil company, including a secret, 40,000-square-foot nuclear fallout shelter. The office building uses bulletproof glass on the first floor and reception area and 3-inch concrete walls with fold-down steel gun ports to protect the bunker 60 feet below. | Limestone mine originally developed by an asphalt company that used the materials in road pavement | Limestone mine
Total data center space (square feet) | 34,000 | 50,000 | 28,000, plus 90,000 of office space in a hardened, above-ground building | 40,000 | 60,000
Total space in facility | 65,000 sq. ft. | 60,000 sq. ft. | 28,000 sq. ft. | 3 million sq. ft. | 145 acres developed; 1,000 acres total
Data center clients include | Insurance company, telephone company, teaching hospital, financial services, e-commerce, security monitoring/surveillance, veterinary, county government | Banking, mission critical Web applications, online trading | NASA/T-Systems, Aker Solutions, Continental Airlines, Houston Chronicle, Express Jet | Healthcare, insurance, universities, technology, manufacturing, professional services | Marriott International Inc., Iron Mountain, three U.S. government agencies
Number of hosted primary or backup data centers | 2 | 50+ | 13 | 26 | 5
Services offered | Leased data center space, disaster recovery space, wholesale bandwidth | Fully managed platforms, partly managed platforms, co-location | Disaster recovery/business continuity, co-location and managed services | Data center space leasing, design, construction and management | Data center leasing, design, construction and maintenance services
Distance from nearest large city | Des Moines, about 45 miles* | Canterbury, 10 miles; London, 60 miles | Houston, 40 miles | Kansas City, 15 miles | Pittsburgh, 55 miles
Location of cooling system, including cooling towers | Underground | Underground | Above and below ground; all cooling towers above ground in secure facility | Air-cooled systems located underground; cooling towers located outside | Chillers located above ground to take advantage of "free cooling"; pumps located underground
Location of generators and fuel tanks | Underground | Above ground and below ground | Two below ground, four above ground; all fuel tanks buried topside | Underground | Underground
*Declined to cite exact location/distance for security reasons.